There are multiple ways to do what you want:

- Create a dedicated `gevent` thread, and explicitly dispatch all of your URL-opening jobs to that thread, which will then do the gevented `urlopen` requests.
- Use threads instead of greenlets. Running 50 threads isn't going to tax any modern OS.
- Use a thread pool and a queue. There's usually not much advantage to doing 50 downloads at the same time instead of, say, 8 at a time (as your browser probably does).
- Use a different async framework instead of `gevent`, one that doesn't work by magically greenletifying your code.
- Use a library that has its own non-magic async support, like `pycurl`.
- Instead of mixing and matching incompatible frameworks, build the server around `gevent` too, or find some other framework that works for both your web-serving and your web-client needs.
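To make the thread-pool option concrete, here's a minimal sketch using the stdlib's `concurrent.futures` (Python 3; on Python 2 you'd use the `futures` backport). The URL list and the `fetch` stub are hypothetical stand-ins for your real download code:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical URLs standing in for whatever you actually need to download.
URLS = ["http://example.com/page/%d" % i for i in range(50)]

def fetch(url):
    # Stub for illustration; in real code this would be something like
    # urllib2.urlopen(url).read() or requests.get(url).content.
    return "body of " + url

# Cap concurrency at 8 workers instead of launching all 50 at once,
# roughly what a browser does.
with ThreadPoolExecutor(max_workers=8) as pool:
    bodies = list(pool.map(fetch, URLS))
```

`pool.map` preserves input order, so `bodies[i]` corresponds to `URLS[i]` even though the fetches run concurrently.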
You could simulate the last one without changing frameworks by loading `gevent` first and having it monkeypatch your threads, forcing your existing threaded server framework to become a `gevent` server. But this may not work, or mostly work but occasionally fail, or work but be much slower… Really, using a framework designed to be `gevent`-friendly (or at least greenlet-friendly) is a much better idea, if that's the way you want to go.
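For reference, the monkeypatch trick described above amounts to this, with all the caveats just mentioned (not verified against your particular server framework):

```python
# Must run before anything else imports threading or socket, i.e. at the
# very top of your main module.
from gevent import monkey
monkey.patch_all()  # patches threading, socket, etc. to be gevent-backed

# Only now import your threaded server framework; its "threads" are now
# greenlets and its sockets are gevented, whether it likes it or not.
```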
You mentioned that others had recommended `requests`. The reason you can't find the documentation is that the built-in async code in `requests` was removed; see an older version of the docs for how it was used. It's now available as a separate library, `grequests`. However, it works by implicitly wrapping `requests` with `gevent`, so it will have exactly the same issues as doing so yourself.
(There are other reasons to use `requests` instead of `urllib2`, and if you want to `gevent` it, it's easier to use `grequests` than to do it yourself.)
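If you do go the `grequests` route, usage is a short sketch like this (the URLs are placeholders; `grequests.map` actually sends the requests, concurrently on gevent, so this needs network access to run):

```python
import grequests

urls = ["http://example.com/", "http://example.org/"]  # placeholders

# grequests.get builds unsent request objects...
reqs = (grequests.get(u) for u in urls)

# ...and grequests.map sends them all concurrently, returning responses
# in the same order (None for any request that failed).
responses = grequests.map(reqs)
for r in responses:
    print(r.status_code if r is not None else "failed")
```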