Question

Our website relies on images from one of our manufacturers. The image directories are massive, and pulling them down over FTP is an all-day job. Now that we've downloaded the entire directory, we'd like to periodically fetch only the files and directories that are new or have changed since our last download. We're thinking about writing a script that checks each file's modification date and downloads only the newer versions.

Since this can't be the first time this problem has been encountered or solved, I thought I'd post this and see if anyone knows of existing solutions that can be applied here. An existing solution would need to be compatible with FreeBSD and/or LAMP.

Solution

You can do this with wput.
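
For reference, wput is the upload counterpart to wget (it pushes files to an FTP server). A minimal invocation, with the file name, credentials, and remote path as placeholders, would look something like this:

wput /local/images/new-photo.jpg ftp://username:password@siteurl.com/path/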

OTHER TIPS

Is there any reason you can't use rsync?
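
If the manufacturer's server happens to offer rsync or SSH access, a sketch of the incremental sync, with the host and both paths as placeholders, could be:

rsync -avz username@siteurl.com:/path/to/images/ /local/images/

rsync transfers only files that are new or changed, which is exactly the incremental update described in the question; the catch is that it requires rsync or SSH on the remote side, which a plain FTP-only server won't provide.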

As user77413 noted in another comment, this should work...

wget --mirror ftp://username:password@siteurl.com/path

The default number of retries is 20; you can increase this with --tries=100.
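
Putting the pieces together, a fuller command line, with the credentials, host, and local target directory as placeholders, might look like this:

wget --mirror --tries=100 --no-host-directories --directory-prefix=/local/images ftp://username:password@siteurl.com/path/

--mirror turns on recursion and timestamping (-N), so repeat runs only fetch files whose remote modification time is newer than the local copy, which matches the incremental update the question asks for.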

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow