Question

I'm trying to get a Facebook Like count for a subset of pages on my site without hitting Facebook's API limits.

If I were to do this for 1,000+ URLs in one request, the generated URL is huge (hitting the HTTP GET length limit, as mentioned in fql like count limit) and Open Graph times out.

Calling Open Graph with FQL once for every URL I want to index is fast and reliable, until the API limit of 600 requests per 600 seconds is hit.

Is there a way to get this data for, say, 10,000 URLs in 1 hour without hitting API limits?

No correct solution

OTHER TIPS

Build your queries so that the URLs don't exceed the 2048-character limit. Assuming the average URL length is 90 characters, you can fit around 20 URLs in each query, so with 600 API calls spread over an hour (which stays within the 600-requests-per-600-seconds limit) at 20 URLs each, you can cover around 12,000 URLs.

You can estimate the average URL length from your database and split the URLs into batches accordingly.
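
Below is a minimal sketch of that batching approach in Python. It assumes the FQL `link_stat` table with its `like_count` column and the `graph.facebook.com/fql` endpoint that the question refers to; check both against the API version you are actually calling. The greedy grouping keeps each generated request URL under the 2048-character limit, and a pause between calls paces requests against the 600-per-600-seconds budget.

```python
import json
import time
import urllib.parse
import urllib.request

# Assumed endpoint, table, and column names, based on the FQL API referenced above;
# verify them against the API version you are actually using.
GRAPH_FQL_ENDPOINT = "https://graph.facebook.com/fql"
MAX_REQUEST_URL_LENGTH = 2048
QUERY_TEMPLATE = 'SELECT url, like_count FROM link_stat WHERE url IN ({})'


def build_request_url(urls, access_token):
    """Build one FQL request URL covering all of the given page URLs."""
    in_clause = ",".join('"{}"'.format(u) for u in urls)
    query = QUERY_TEMPLATE.format(in_clause)
    return (GRAPH_FQL_ENDPOINT
            + "?q=" + urllib.parse.quote(query)
            + "&access_token=" + access_token)


def batch_urls(urls, access_token, max_len=MAX_REQUEST_URL_LENGTH):
    """Greedily group URLs so each generated request URL stays under max_len."""
    batch = []
    for url in urls:
        candidate = batch + [url]
        if len(build_request_url(candidate, access_token)) > max_len and batch:
            yield batch
            batch = [url]
        else:
            batch = candidate
    if batch:
        yield batch


def fetch_like_counts(urls, access_token, pause=1.0):
    """Fetch like counts batch by batch, pausing between calls to respect the rate limit."""
    like_counts = {}
    for batch in batch_urls(urls, access_token):
        request_url = build_request_url(batch, access_token)
        with urllib.request.urlopen(request_url) as response:
            data = json.loads(response.read().decode("utf-8"))
        for row in data.get("data", []):
            like_counts[row["url"]] = row["like_count"]
        time.sleep(pause)  # simple pacing; tune to the 600-requests-per-600-seconds budget
    return like_counts
```

With ~90-character URLs this groups roughly 20 per request, so 10,000 URLs work out to about 500 calls; with a one-second pause that should stay under 600 requests per 600 seconds, though the pause is worth tuning to whatever limit actually applies to your app.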

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow