Question

I use Python's ftplib to upload a binary file to a remote FTP server, but the transferred file is always smaller than its actual size. PS: my local environment is Windows, the remote server is Linux. I use:

    ftp.storbinary('STOR %s' % filename, open(filename, 'rb'))

I suspect Python's read() is not working properly and treats some special character as EOF when it actually is not.

How can I upload a binary file without losing bytes?


Solution 2

It took several hard tests before I understood why. First, this is not a bug in Python. The file being transferred had not been flushed from memory to disk. Consider this:

f.retrbinary('RETR '+filename, filehandler.write, bufsize) 

After this call, the file handler is not closed explicitly; retrbinary doesn't close it, so storing the file immediately afterwards loses the bytes still sitting in memory. If we close the file handler explicitly after the transfer, like this:

f.retrbinary('RETR '+filename, filehandler.write, bufsize)
filehandler.close()

then we get all the bytes. For more detail, see: http://blog.csdn.net/hongchangfirst
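The failure mode described above can be demonstrated without a real FTP server. In this minimal sketch, the loop plays the role of retrbinary's callback, writing chunks into a buffered file object; the size on disk is only guaranteed to match once the handle is closed. The file name and chunk size are illustrative.

```python
import os

payload = os.urandom(100_000)  # stand-in for the bytes RETR would deliver

# Buggy pattern: write via the handle, then check the file before closing.
fh = open('download.bin', 'wb')
for i in range(0, len(payload), 1000):
    fh.write(payload[i:i + 1000])       # some bytes may still sit in the buffer
size_before_close = os.path.getsize('download.bin')

fh.close()                               # flushes the remaining buffered bytes
size_after_close = os.path.getsize('download.bin')

print(size_before_close, size_after_close)
os.remove('download.bin')
```

Only `size_after_close` is guaranteed to equal the payload length; the size observed before closing is typically smaller because of the write buffer.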

OTHER TIPS

I actually just fought through this issue. I had to close the file being written before opening it for upload to the FTP server.

out2 = open('file.csv', 'w')  # open for writing; the mode was missing originally
for r1 in cursor:
    out2.write(str(r1))
out2.close()  # close so all buffered rows actually reach the disk

ftp_census = file_loc
stor_census = str("STOR egocensus_" + demoFileDate + ".csv")
fc = open(ftp_census, 'rb')
ftp.storbinary(stor_census, fc, 1024)
fc.close()

Once I closed the file, the file size on the FTP server was correct. I also edited the original answer to show the code better. I could probably code this better, but it is working.
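The same idea can be packaged with `with` blocks, which close the handles even if the transfer raises. This is a sketch, not the original poster's code; the host, credentials, and file names are placeholders.

```python
from ftplib import FTP

def upload_file(ftp, local_path, remote_name, blocksize=8192):
    # The with-block closes the local handle even if storbinary raises,
    # so no buffered bytes are left behind.
    with open(local_path, 'rb') as fh:
        ftp.storbinary('STOR ' + remote_name, fh, blocksize)

# Hypothetical usage (cursor, host, and credentials are placeholders):
# with open('file.csv', 'w') as out2:   # closed, and therefore flushed, on exit
#     for r1 in cursor:
#         out2.write(str(r1))
# ftp = FTP('ftp.example.com')
# ftp.login('user', 'password')
# upload_file(ftp, 'file.csv', 'egocensus.csv')
# ftp.quit()
```

Because `upload_file` only calls `storbinary` on the object it is given, it works with any object exposing that method, which also makes it easy to test without a server.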

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow