Problem

I improved my code a lot and now all the API calls run really fast. I also added memcache and I have a great hit ratio. But sometimes I get unexplained delays.

I attached the most significant Appstats screenshot: more than 20 seconds in total to run 90ms of RPCs. How is that possible? Where should I look to find the origin of these delays?

I am really stuck because I don't understand what's happening between the RPCs, and I don't know what else I can do to get more information.

Just a thought: each HTTP call is handled by the same GAE instance, right? My instances take a long time to warm up, but I don't think that's related.

BTW: I am coding in Java.

appstats statistics


Solution

Usually the unaccounted-for "hole" in the middle of an Appstats trace is your own code executing.
Appstats records every RPC entry and exit; the gaps it cannot account for are your actual code running.

Do you have logs for the time the application spent between those two calls?
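If the logs don't make the gap obvious, one way to narrow it down is to bracket the suspect section of code with wall-clock timestamps and log the elapsed time yourself. A minimal sketch (the class and method names here are illustrative, not from the question):

```java
import java.util.logging.Logger;

// Minimal sketch: time the code that runs between two RPCs so the
// "hole" Appstats can't explain shows up in your own application logs.
public class SectionTimer {
    private static final Logger log = Logger.getLogger(SectionTimer.class.getName());

    // Runs the given section and logs how long it took, in milliseconds.
    public static long timeSection(String label, Runnable section) {
        long start = System.currentTimeMillis();
        section.run();
        long elapsed = System.currentTimeMillis() - start;
        log.info(label + " took " + elapsed + " ms");
        return elapsed;
    }

    public static void main(String[] args) {
        timeSection("between-RPCs work", () -> {
            // Placeholder for the code that actually runs between the RPC calls.
            try { Thread.sleep(50); } catch (InterruptedException ignored) { }
        });
    }
}
```

Sprinkling a few of these around the code that runs between the RPCs will tell you whether the time is spent in your own logic or elsewhere (for example, in instance startup).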

Other tips

Huge, 'unexplained' latency is almost always warmup requests gobbling up resources. Inspect your App Engine logs to see how much api_ms and cpu_ms are being used by warmups.

You can avoid warmups by increasing your maximum pending latency in the App Engine control panel. Allowing higher latency means requests will wait longer before a new instance is spun up. This can make each request a little slower, but you will avoid heavyweight loading requests.

To help with warmup requests, make sure your appengine-web.xml has:

<warmup-requests-enabled>true</warmup-requests-enabled>  

This will cause the App Engine dispatcher to preemptively fire up new instances when the current ones are becoming overloaded (i.e. it starts loading before a request goes to the new instance).
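For reference, App Engine delivers warmup requests to the path /_ah/warmup. If you want custom initialization logic (priming caches, opening connections) to run during warmup rather than on a user request, you can map a servlet to that path in web.xml. A sketch, where the servlet name and class are illustrative placeholders:

```xml
<!-- web.xml: optional handler for App Engine's warmup requests.
     The servlet name and class below are placeholders. -->
<servlet>
  <servlet-name>warmup</servlet-name>
  <servlet-class>com.example.WarmupServlet</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>warmup</servlet-name>
  <url-pattern>/_ah/warmup</url-pattern>
</servlet-mapping>
```

This is optional: with warmup requests enabled, App Engine sends the request to /_ah/warmup regardless, and any load-on-startup servlets are initialized as part of it.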

Then, for the affected slow servlets, make sure you set load-on-startup in your web.xml:

<servlet>
  <servlet-name>my-servlet</servlet-name>
  <servlet-class>com.company.MyServlet</servlet-class>
  <load-on-startup>1</load-on-startup>
</servlet>

load-on-startup ensures that your high-priority servlets are initialized during warmup, so they are ready to go as soon as the warmup request finishes.

License: CC-BY-SA with attribution
Not affiliated with StackOverflow