Question

I have two instances of App Engine applications running that I want to communicate via a RESTful interface. Once the data on one is updated, it calls a webhook on the second, which then retrieves a fresh copy of the data for its own system. Inside 'site1' I have:

 from google.appengine.api import urlfetch

 url = 'http://www.site2.com/data_updated'
 result = urlfetch.fetch(url)

Inside the handler for data_updated on 'site2' I have:

 url = 'http://www.site1.com/get_new_data'
 result = urlfetch.fetch(url)

There is very little data being passed between the two sites, but I receive the following error. I've tried increasing the deadline to 10 seconds, but this still doesn't work.

 DeadlineExceededError: ApplicationError: 5 

Can anyone provide any insight into what might be happening?

Thanks - Richard


Solution 2

Changing the method from

  result = urlfetch.fetch(url)

to

  result = urlfetch.fetch(url, deadline=2, method=urlfetch.POST)

has fixed the Deadline errors.

From the urlfetch documentation:

deadline: The maximum amount of time to wait for a response from the remote host, as a number of seconds. If the remote host does not respond in this amount of time, a DownloadError is raised.

Time spent waiting for a request does not count toward the CPU quota for the request. It does count toward the request timer. If the app request timer expires before the URL Fetch call returns, the call is canceled.

The deadline can be up to a maximum of 60 seconds for request handlers and 10 minutes for task queue and cron job handlers. If deadline is None, the deadline is set to 5 seconds.
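Putting that together, a minimal sketch of the corrected call (the URL, function name, and deadline value are placeholders, not taken from the original post):

  from google.appengine.api import urlfetch

  def notify_site2():
      # Explicit deadline (in seconds) and explicit HTTP method, as in the fix above.
      url = 'http://www.site2.com/data_updated'
      try:
          result = urlfetch.fetch(url, deadline=10, method=urlfetch.POST)
          return result.status_code, result.content
      except urlfetch.DownloadError:
          # Raised when the remote host does not respond within the deadline.
          return None, None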

OTHER TIPS

App Engine's urlfetch doesn't always behave as expected; you have about 10 seconds to fetch the URL. Assuming the URL you're trying to fetch is up and running, you should be able to catch the DeadlineExceededError by importing apiproxy_errors from google.appengine.runtime and wrapping the urlfetch call in a try/except block with except apiproxy_errors.DeadlineExceededError: (see the sketch below).

Relevant answer here.
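A minimal sketch of that try/except approach (the function name and deadline value are illustrative):

  from google.appengine.api import urlfetch
  from google.appengine.runtime import apiproxy_errors

  def fetch_with_deadline_guard(url, deadline=10):
      try:
          return urlfetch.fetch(url, deadline=deadline)
      except apiproxy_errors.DeadlineExceededError:
          # The fetch ran out of time; log it, return None, or queue a retry.
          return None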

Have you tried manually querying the URLs (www.site2.com/data_updated and www.site1.com/get_new_data) with curl or otherwise to make sure that they're responding within the time limit? Even if the amount of data that needs to be transferred is small, maybe there's a problem with the handler that's causing a delay in returning the results.
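For example, a quick timing check run from your own machine (plain Python 2 here rather than curl; the URL is a placeholder):

  import time
  import urllib2

  url = 'http://www.site2.com/data_updated'
  start = time.time()
  response = urllib2.urlopen(url, timeout=15)
  body = response.read()
  # Shows the status code, payload size, and how long the handler took to respond.
  print 'status=%s bytes=%d elapsed=%.2fs' % (response.getcode(), len(body), time.time() - start)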

The amount of data being transferred is not the problem here; the latency is.

If the app you are talking to often takes more than 10 seconds to respond, you will have to use a "proxy callback" server on another cloud platform (EC2, etc.). If you can hold off for a while, the new backend instances are supposed to relax the urlfetch time limits somewhat.

If the average response time is under 10 seconds, and only relatively few calls are failing, just retry a few times (a simple retry wrapper is sketched below). I hope for your sake the calls are idempotent (i.e. a retry doesn't have adverse effects). If not, you might be able to roll your own layer on top - it's a bit painful, but it works OK; it's what we do.

J
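A rough sketch of that retry idea, assuming the calls really are idempotent (the function name and backoff scheme are illustrative):

  import time
  from google.appengine.api import urlfetch

  def fetch_with_retries(url, attempts=3, deadline=10):
      for attempt in range(attempts):
          try:
              result = urlfetch.fetch(url, deadline=deadline)
              if result.status_code == 200:
                  return result
          except urlfetch.DownloadError:
              pass  # Timed out; fall through and try again.
          time.sleep(2 ** attempt)  # simple backoff; note this counts against the request timer
      return None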

The GAE docs now state that the deadline can be up to 60 seconds:

  result = urlfetch.fetch(url, deadline=60, method=urlfetch.POST)
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow