Problem

I have a Python script which connects to a remote FTP server and downloads a file. As the server I connect to is not very reliable, it often happens that the transfer stalls and transfer rates become extremely low. However, no error is raised, so that my script stalls as well.

I use the ftplib module, with the retrbinary function. I would like to be able to set a timeout value after which the download aborts, and then automatically retry/restart the transfer (resuming would be nice, but that's not strictly necessary, as the files are only ~300M).


Solution

I managed to do what I needed using the threading module:

 import threading
 import time
 from ftplib import FTP

 TIMEOUT = 60.

 conn = FTP(hostname, timeout=TIMEOUT)
 conn.set_pasv(True)
 conn.login()
 while True:
     localfile = open(local_filename, "wb")
     try:
         dlthread = threading.Thread(target=conn.retrbinary,
                 args=("RETR {0}".format(remote_filename), localfile.write))
         dlthread.start()
         dlthread.join(timeout=TIMEOUT)
         if not dlthread.is_alive():
             break  # download finished
         print("download didn't complete within {timeout}s. "
                 "waiting for 10s ...".format(timeout=TIMEOUT))
         time.sleep(10)
         print("restarting thread")
     except KeyboardInterrupt:
         raise
     except Exception:
         pass  # swallow transfer errors and retry from scratch
     finally:
         localfile.close()

Other tips

What about the timeout argument of the FTP class? http://docs.python.org/2/library/ftplib.html#ftplib.FTP

License: CC-BY-SA with attribution
Not affiliated with StackOverflow