Question

I need to move lots of files from one server to another. All the files are images and fairly small, but in total they are 10GB+.

I have attempted to use tar and gzip, but this fails because of 'too many arguments'.

How can I get around the 'too many arguments' issue?

Is there another way perhaps - could SCP or rsync do this?

Both boxes are Ubuntu.

Edit

The following seems to just hang and nothing happens; the images.tar.gz file is zero bytes after the command is killed.

tar -zcvf images.tar.gz images

The solution

-T - tells tar to read the file names from stdin. So you can do something like:

find . -name \*.jpg -print0 | tar -zcvf images.tar.gz --null -T -

However I would recommend rsync instead, as I noted in the comments; see the example below.

As noted in the comments, -print0 terminates each file name with a null byte ('\0') and --null tells tar to expect that, so file names containing spaces, newlines, or other awkward characters are handled correctly.
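For example, a minimal rsync invocation for this case might look like the following (user@host and the destination path are placeholders):

rsync -avz images/ user@host:/path/to/images/

Here -a preserves permissions and timestamps, -z compresses data in transit, and if the transfer is interrupted, rerunning the same command skips files that already arrived instead of starting over.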

Other tips

Try cpio; it reads the list of files from stdin. You can pipe the output through ssh:

find images -print0 | cpio -o0 -H ustar | ssh user@host 'cpio -id'

Or with compression

find images -print0 | cpio -o0 -H ustar | gzip | ssh user@host 'gunzip | cpio -id'

Update

Actually, this answer won't work as-is, since GNU cpio only implements archive formats with limited sizes. The answer has been updated to use the ustar format, which raises the limit from the default format's 2 GiB to 8 GiB, but that is still a limit. Mark Adler's tar answer is probably the best way to go for larger amounts of data.
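For reference, a minimal sketch of that tar-over-ssh approach (user@host and /destination are placeholders) could be:

tar -czf - images | ssh user@host 'tar -xzf - -C /destination'

This streams the gzipped archive straight over the connection, so no intermediate archive file is created on either side.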
