Question

Let's say I wanted to make a Python script interface with a site like Twitter.

What would I use to do that? I'm used to using curl/wget from bash, but Python seems to be much nicer to use. What's the equivalent?

(This isn't Python run from a webserver, but run locally via the command line)


Solution

For something like Twitter, you'll save yourself a ton of time by not reinventing the wheel. Try a library like python-twitter. That way you can write your script, or even a full-fledged application, that interfaces with Twitter without having to care about the implementation details.

If you want to roll your own interface library, you're going to have to get familiar with urllib and, depending on the format the site returns results in, either lxml (or some other XML parser) or simplejson.
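As a minimal sketch of the JSON half of that: in modern Python the stdlib `json` module does what simplejson was used for. The payload below is a made-up example shaped roughly like a Twitter search response, not real API output.

```python
import json

# Hypothetical JSON payload, shaped loosely like a Twitter API response.
payload = '{"statuses": [{"id": 1, "text": "hello"}, {"id": 2, "text": "world"}]}'

# Parse the JSON string into Python dicts/lists.
data = json.loads(payload)

# Pull out just the tweet texts.
texts = [status["text"] for status in data["statuses"]]
print(texts)  # ['hello', 'world']
```

The same `json.loads` call works on the bytes/string body you get back from an HTTP fetch.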

OTHER TIPS

Python has urllib2, an extensible library for opening URLs.

It's a full-featured, easy-to-use library.

https://docs.python.org/library/urllib2.html
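Note that in Python 3, urllib2 was folded into `urllib.request`. A minimal sketch of building a request with a custom header (the Twitter URL here is just an illustrative placeholder; the actual fetch line is left commented out so the snippet runs offline):

```python
import urllib.request

# Build a request with a custom User-Agent header; some sites reject
# Python's default user agent.
req = urllib.request.Request(
    "https://api.twitter.com/1.1/statuses/user_timeline.json",
    headers={"User-Agent": "my-script/0.1"},
)

print(req.full_url)
# Performing the fetch would be:
# body = urllib.request.urlopen(req).read()
```

`urlopen` returns a file-like object, so `.read()` gives you the raw response body, much like `curl` piping to stdout.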

I wholeheartedly recommend mechanize for Python. It's essentially a programmable web browser you can drive from Python, and it handles forms and cookies as well. It makes any kind of site crawling a breeze.

Take a look at the examples at that link to see what it can do.

Python has a very nice httplib module as well as the urllib module, which together will probably accomplish most of what you need (at least with regard to wget-style functionality).
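For reference, httplib is called `http.client` in Python 3. A minimal sketch, again using a Twitter hostname purely as a placeholder; constructing the connection object doesn't open a socket, so the actual request line is left commented out:

```python
import http.client

# Creating the connection object is cheap and does not connect yet;
# the socket is only opened when a request is sent.
conn = http.client.HTTPSConnection("api.twitter.com", timeout=10)

print(conn.host, conn.port)  # api.twitter.com 443
# Sending the request would be:
# conn.request("GET", "/1.1/statuses/user_timeline.json")
# resp = conn.getresponse()
```

This is the lower-level layer that urllib itself is built on, so you'd normally only reach for it when you need fine control over the HTTP exchange.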

If you're used to dealing with cURL, consider PycURL.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow