Question

I have a Python script which connects to a remote FTP server and downloads a file. Because the server is not very reliable, the transfer often stalls and the rate drops to almost nothing. No error is raised, however, so my script stalls as well.

I use the ftplib module with the retrbinary function. I would like to be able to set a timeout after which the download aborts and is then automatically retried/restarted (resuming would be nice, but it's not strictly necessary, as the files are only ~300 MB).

Solution

I managed to do what I needed using the threading module:

 import threading
 import time
 from ftplib import FTP

 conn = FTP(hostname, timeout=60.0)
 conn.set_pasv(True)
 conn.login()
 while True:
     localfile = open(local_filename, "wb")
     try:
         # run retrbinary in a worker thread so we can enforce our own timeout
         dlthread = threading.Thread(target=conn.retrbinary,
                 args=("RETR {0}".format(remote_filename), localfile.write))
         dlthread.start()
         dlthread.join(timeout=60.0)
         if not dlthread.is_alive():
             break  # the thread finished, i.e. the download completed
         del dlthread
         print("download didn't complete within {timeout}s. "
                 "waiting for 10s ...".format(timeout=60))
         time.sleep(10)
         print("restarting thread")
     except KeyboardInterrupt:
         raise
     except Exception:  # a bare except would also swallow SystemExit
         pass
     finally:
         localfile.close()  # runs before the break above, too

OTHER TIPS

What about the timeout argument of the FTP class? See http://docs.python.org/2/library/ftplib.html#ftplib.FTP
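
For reference, here is a minimal sketch of that approach, reusing the hostname, local_filename and remote_filename variables from the solution above and assuming the server supports the REST command for resuming: the timeout makes a fully stalled socket operation raise socket.timeout, and retrbinary's rest argument restarts the transfer from the bytes already on disk.

 import os
 import socket
 from ftplib import FTP

 while True:
     # resume from whatever we already have locally (0 on the first attempt)
     offset = os.path.getsize(local_filename) if os.path.exists(local_filename) else 0
     try:
         conn = FTP(hostname, timeout=60.0)  # stalled socket operations raise socket.timeout
         conn.set_pasv(True)
         conn.login()
         with open(local_filename, "ab") as localfile:
             # rest=offset sends "REST <offset>" so the server resumes mid-file
             conn.retrbinary("RETR {0}".format(remote_filename),
                             localfile.write, rest=offset)
         conn.quit()
         break  # transfer completed without timing out
     except (socket.timeout, socket.error):
         print("transfer stalled, reconnecting ...")

One caveat: the socket timeout only fires when no data arrives at all for the full period. A transfer that merely slows to a trickle keeps the socket alive, which is why a thread-based timeout like the one in the solution above can still be necessary.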

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow