Question

I wrote a little tool that gathers data from Facebook using its API. The tool uses multiprocessing (with queues) and the httplib module. Here is part of the code:

main process:

def extract_and_save(args):
    put_queue = JoinableQueue()
    get_queue = Queue()

    for index in range(args.number_of_processes):
        process_name = u"facebook_worker-%s" % index
        grabber = FacebookGrabber(get_queue=put_queue, put_queue=get_queue, name=process_name)
        grabber.start()

    friend_list = get_user_friends(args.default_user_id, ["id"])
    for index, friend_id in enumerate(friend_list):
        put_queue.put(friend_id)

    put_queue.join()
    if not get_queue.empty():
        ... save to database ...
    else:
        logger.info(u"There is no data to save")
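For reference, the `put_queue.join()` / `task_done()` handshake used above can be shown in a minimal, self-contained sketch; the doubling worker below is just a placeholder for the real Facebook call:

```python
from multiprocessing import JoinableQueue, Process, Queue

def worker(get_queue, put_queue):
    # Loop forever; the process is a daemon, so it dies with the parent.
    while True:
        item = get_queue.get(block=True)
        put_queue.put(item * 2)   # stand-in for the real API request
        get_queue.task_done()     # lets put_queue.join() unblock

def main():
    put_queue = JoinableQueue()
    get_queue = Queue()

    for _ in range(2):
        p = Process(target=worker, args=(put_queue, get_queue))
        p.daemon = True
        p.start()

    for item in [1, 2, 3]:
        put_queue.put(item)

    put_queue.join()  # blocks until every task_done() has been called

    # Collect exactly as many results as tasks were submitted; using
    # get(timeout=...) avoids racing against the feeder threads the way
    # a bare get_queue.empty() check can.
    return sorted(get_queue.get(timeout=5) for _ in range(3))

if __name__ == "__main__":
    print(main())
```

Because the workers never exit on their own, the `daemon = True` flag is what keeps them from outliving the main process, exactly as in the `FacebookGrabber` below.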

worker process:

class FacebookGrabber(Process):
    def __init__(self, *args, **kwargs):
        self.connection = httplib.HTTPSConnection("graph.facebook.com", timeout=2)
        self.get_queue = kwargs.pop("get_queue")
        self.put_queue = kwargs.pop("put_queue")
        super(FacebookGrabber, self).__init__(*args, **kwargs)
        self.daemon = True

    def run(self):
        while True:
            friend_id = self.get_queue.get(block=True)
            try:
                friend_obj = self.get_friend_obj(friend_id)
            except Exception, e:
                logger.info(u"Friend id %s: facebook responded with an error (%s)", friend_id, e)
            else:
                if friend_obj:
                    self.put_queue.put(friend_obj)
            self.get_queue.task_done()

common code:

def get_json_from_facebook(connection, url, kwargs=None):
    url_parts = list(urlparse.urlparse(url))
    query = dict(urlparse.parse_qsl(url_parts[4]))
    if kwargs:
        query.update(kwargs)
    url_parts[4] = urllib.urlencode(query)
    url = urlparse.urlunparse(url_parts)
    try:
        connection.request("GET", url)
    except Exception, e:
        print "<<<", e

    response = connection.getresponse()
    data = json.load(response)
    return data
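The query-string manipulation in `get_json_from_facebook` is a common idiom on its own; a standalone sketch (with imports that work under both Python 2 and 3) looks like this:

```python
try:
    # Python 2 module names
    from urlparse import urlparse, urlunparse, parse_qsl
    from urllib import urlencode
except ImportError:
    # Python 3 moved everything into urllib.parse
    from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def add_query_params(url, extra):
    """Merge extra parameters into a URL's query string, the same
    technique get_json_from_facebook uses."""
    parts = list(urlparse(url))
    query = dict(parse_qsl(parts[4]))  # index 4 is the query component
    query.update(extra)
    parts[4] = urlencode(query)
    return urlunparse(parts)
```

For example, `add_query_params("https://graph.facebook.com/me?fields=id", {"limit": "10"})` returns the same URL with both `fields=id` and `limit=10` in the query string.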

This code works perfectly on Ubuntu. But when I tried to run it on Windows 7, I got the message "There is no data to save". The problem is here:

try:
    connection.request("GET", url)
except Exception, e:
    print "<<<", e

I get the following error: <<< a float is required

Does anybody know how to fix this problem?

Python version: 2.7.5

Solution

One of the gotchas that occasionally comes up with socket timeout values is that most operating systems expect them as floats. I believe later versions of the Linux kernel account for this.

Try changing:

self.connection = httplib.HTTPSConnection("graph.facebook.com", timeout=2)

to:

self.connection = httplib.HTTPSConnection("graph.facebook.com", timeout=2.0)

That's 2 seconds, by the way. The default is typically 5 seconds, so 2 might be a little low.
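You can verify the float behaviour without any network access, since the `HTTPSConnection` constructor does not actually open a connection (the import shim below covers both the Python 2 `httplib` and Python 3 `http.client` names):

```python
import socket

try:
    from httplib import HTTPSConnection       # Python 2
except ImportError:
    from http.client import HTTPSConnection   # Python 3

# Pass the timeout explicitly as a float; no connection is opened yet,
# so this is safe to run offline.
conn = HTTPSConnection("graph.facebook.com", timeout=2.0)
assert conn.timeout == 2.0

# The underlying socket machinery stores timeouts as floats:
s = socket.socket()
s.settimeout(2.0)
assert isinstance(s.gettimeout(), float)
s.close()
```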

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow