Question

I'm using Python and httplib to implement a really simple file uploader for my file-sharing server. Files larger than 1 MB are split into chunks and uploaded one chunk at a time. The network connection between my client and server is quite good (100 Mbps, <3 ms latency).
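For context, the upload loop looks roughly like the following sketch. The `/upload` endpoint and the `X-Chunk-Index` header are hypothetical stand-ins for my server's API; `http.client` is the Python 3 name for httplib:

```python
# Sketch of a chunked uploader, assuming a hypothetical /upload
# endpoint that accepts one chunk per PUT request.
import http.client

CHUNK_SIZE = 1024 * 1024  # 1 MB

def iter_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield the file's contents one chunk at a time."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk

def upload(host, path, endpoint="/upload"):
    conn = http.client.HTTPConnection(host)
    try:
        for index, chunk in enumerate(iter_chunks(path)):
            headers = {
                "Content-Length": str(len(chunk)),
                "X-Chunk-Index": str(index),  # hypothetical chunk header
            }
            conn.request("PUT", endpoint, body=chunk, headers=headers)
            resp = conn.getresponse()
            resp.read()  # drain the response so the connection can be reused
    finally:
        conn.close()
```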

When the chunk size is small (below 128 kB or so), everything works fine (>200 kB/s). But when I increase the chunk size to 256 kB or above, each chunk takes about 10 times longer to complete than with 128 kB chunks (<20 kB/s). Stranger still, this only happens on my win32 machine (Win8 x86, running 32-bit Python), not on my amd64 one (Win8 amd64, running 64-bit Python).

After some profiling, I've narrowed my search down to the request() and getresponse() functions of httplib.HTTPConnection, as these are where the blocking occurs.
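For reference, this kind of narrowing-down can be done with the standard cProfile module; sorting by cumulative time makes calls like request() and getresponse() stand out. The `do_upload` function below is a stand-in for the real upload loop:

```python
# Profiling sketch: cProfile reports cumulative time per function,
# which is how specific calls can be singled out as the bottleneck.
import cProfile
import io
import pstats

def do_upload():
    # stand-in for the real chunked upload loop
    return sum(range(100000))

profiler = cProfile.Profile()
profiler.enable()
do_upload()
profiler.disable()

stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream)
stats.sort_stats("cumulative").print_stats(10)
report = stream.getvalue()
print(report)
```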

My first guess was socket buffering, but changing the SO_SNDBUF and TCP_NODELAY options does not help much. I've also checked my server side, and everything there is normal.
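The options can be applied to the connection's underlying socket after connect(); note that the `sock` attribute is an internal detail of httplib, so this is only a sketch:

```python
import socket

def tune_socket(sock, sndbuf=256 * 1024):
    # Disable Nagle's algorithm so small writes are sent immediately.
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
    # Request a larger kernel send buffer (the OS may round or clamp it).
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, sndbuf)
    return sock

# Usage with httplib/http.client (conn.sock is an internal attribute):
#   conn = httplib.HTTPConnection(host)
#   conn.connect()
#   tune_socket(conn.sock)
```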

I really hope someone can help me out here. Switching to a different HTTP library (such as pycurl) is the last thing I want to do. Thanks in advance!


Solution

It turns out to be a VM-related problem. I was running my Python code in a VM; when I copied the same code to a physical machine running the same Windows edition, the problem disappeared.

As I'm totally unfamiliar with VM internals, it would be great if someone could explain why such a problem occurs in a VM.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow