Question

I was wondering if anyone has observed that the time taken to download or upload a file over FTP using Python's ftplib is much larger than performing an FTP get/put at the Windows command prompt or with Perl's Net::FTP module.

I created a simple FTP client similar to http://code.activestate.com/recipes/521925-python-ftp-client/ but I am unable to reach the speed I get when running FTP at the Windows DOS prompt or using Perl. Is there something I am missing, or is it a problem with the Python ftplib module?

I would really appreciate it if you could shed some light on why I am getting low throughput with Python.
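
For context, the download side of my client does roughly the following (the host, credentials, and file name are placeholders; 1024 is the block size I was passing):

from ftplib import FTP

ftp = FTP('ftp.example.com')       # placeholder host
ftp.login('username', 'password')  # placeholder credentials
with open('bigfile.bin', 'wb') as f:
    # 1024-byte blocks mean one socket recv and one callback per KB
    ftp.retrbinary('RETR bigfile.bin', f.write, blocksize=1024)
ftp.quit()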


Solution

The problem was with the block size; I was using a block size of 1024 bytes, which was too small. After increasing the block size to 256 KB (262144 bytes), the speeds are similar across all the different platforms.

import ftplib

def putfile(filename, site, user=()):
    handle = ftplib.FTP(site)
    handle.login(*user)  # user is a (username, password) tuple
    print("Upload started")
    with open(filename, 'rb') as up_file:
        # 262144-byte (256 KB) blocks instead of the tiny 1024-byte blocks
        handle.storbinary('STOR ' + filename, up_file, blocksize=262144)
    print("Upload completed")
    handle.quit()
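
A call then looks like this (host and credentials are again placeholders):

putfile('report.pdf', 'ftp.example.com', user=('username', 'password'))

With 256 KB blocks, each pass through storbinary's read-and-send loop moves 256 times as much data as before, so the per-iteration Python overhead stops dominating the transfer.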

OTHER TIPS

I had a similar issue with the default blocksize of 8192 when using FTP_TLS.

from ftplib import FTP_TLS

site = 'ftp.siteurl.com'
user = 'username-here'
upass = 'supersecretpassword'
newfilename = 'filename.txt'  # local destination path (placeholder)
ftp = FTP_TLS(host=site, user=user, passwd=upass)
# ftp.prot_p()  # uncomment to encrypt the data channel if the server requires it
with open(newfilename, 'wb') as f:
    # write each received chunk straight to the local file
    def callback(data):
        f.write(data)
    # 256 KB blocks instead of the 8192-byte default
    ftp.retrbinary('RETR filename.txt', callback, blocksize=262144)
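
Since the callback does nothing but forward each chunk to the file, the bound method can be passed directly and the inner function dropped:

ftp.retrbinary('RETR filename.txt', f.write, blocksize=262144)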

Increasing the block size increased the speed roughly 10x. Thanks @Tanmoy Dube.

Licensed under: CC-BY-SA with attribution