It would be a bad idea to send a single command to push more than a few thousand items. It would saturate the communication buffers, and such a large command would block all other concurrent commands due to the single-threaded nature of Redis.
I suggest building your push commands in small batches of n items (n between 10 and 100), and grouping those push commands in a pipeline of m commands (m between 10 and 100).
The algorithm would be something like this:
While there are still lines to read:
    Start a new Redis pipeline, i = 0
    While there are still lines to read and i < m:
        Read at most n lines
        Build a push command for the lines just read
        Add the push command to the pipeline
        ++i
    Flush the Redis pipeline, checking the return statuses if needed
This generates only N / (n*m) roundtrips (N being the number of lines in the input file). For example, with N = 1,000,000, n = 50 and m = 50, only 400 roundtrips are needed.
To build commands with an arbitrary number of parameters, you can use the redisAppendCommandArgv function from hiredis.