I managed to solve a similar problem. I'm not sure whether it applies to your case, but I decided to document it here anyway in case it helps someone.
In my case I was analyzing a huge number of tweets (52,000 in total) by dividing them among multiple worker processes. It worked fine on OS X and on a server, but on my Windows 8.1 machine it was really slow and the processes started sequentially. Looking at Task Manager, I noticed that the main Python process's memory usage climbed up to around 1.5 GB, and the worker processes' memory usage climbed similarly. Then I noticed that an older version of my code, which used a slightly different algorithm, worked fine. In the end the problem was that I retrieved whole tweet objects from the database when I only needed the text part of each tweet. This apparently caused the excessive memory usage. After I fixed that part, the program launched its worker processes properly.
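To illustrate the fix, here is a minimal sketch of the pattern. The table layout, column names, and the `analyze` function are all hypothetical; the point is simply to select only the text column from the database and hand the workers small strings instead of large row objects:

```python
import sqlite3
from multiprocessing import Pool

def analyze(text):
    # Hypothetical per-tweet analysis; here it just counts words.
    return len(text.split())

def load_tweet_texts(conn):
    # Fetch ONLY the text column, not the whole tweet rows, so each
    # task sent to a worker is a small string rather than a big object.
    return [row[0] for row in conn.execute("SELECT text FROM tweets")]

if __name__ == "__main__":
    # Toy in-memory database standing in for the real tweet store.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE tweets (id INTEGER, text TEXT, raw_json TEXT)")
    conn.executemany(
        "INSERT INTO tweets VALUES (?, ?, ?)",
        [(i, "hello world %d" % i, "{...large payload...}") for i in range(100)],
    )
    texts = load_tweet_texts(conn)
    with Pool(4) as pool:
        counts = pool.map(analyze, texts)
    print(sum(counts))
```

Keeping the task payloads small matters especially on Windows, where each `multiprocessing` task is pickled and sent to the workers over a pipe.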
So based on my experience, I have a hunch that Windows throttles the worker processes when RAM usage gets high. If so, check the RAM usage of your processes. This is just speculation on my part, so I'm interested if someone has a better explanation.