Question

I've been getting reports from some visitors of one of my web sites that it loads pretty slowly for them, but I can't reproduce this slowness from any of my machines or Internet connections.

I've pretty much eliminated the server itself as the cause, so it could be anything from hosting-provider problems, to CDN problems affecting specific ISPs or countries, to (theoretically) some sort of "internet protection suite" subjecting my site to lengthy scrutiny on the local machines of some of my visitors.

Are there any tools or methods you can recommend for gaining some insight into this mysterious slowness?

I guess I could use some JavaScript code that measures the load time of each and every element on the page (including linked CSS/JS files) and, once the page has completely loaded, posts the data back to the server. I could then collect the anomalous data and look for patterns in things like user agents, or, for registered users, contact them directly to get more hints. Is there perhaps a library that does something like this?
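The idea above can be sketched with the browser's Resource Timing API, which exposes per-resource load durations. This is only an illustration, not a specific library: the `/beacon` endpoint and payload shape are assumptions.

```javascript
// Sketch: collect per-resource load timings with the Resource Timing API
// and beacon them back to the server once the page has fully loaded.
// The "/beacon" endpoint and the payload fields are illustrative assumptions.

// Pure helper: turn PerformanceResourceTiming-like entries into a compact payload.
function buildPayload(entries) {
  return entries.map(function (e) {
    return {
      url: e.name,                       // resource URL (css/js/img...)
      duration: Math.round(e.duration),  // total load time in ms
      type: e.initiatorType              // "link", "script", "img", ...
    };
  });
}

// Browser-only part: gather entries after load and send them.
// navigator.sendBeacon survives page unload, unlike a plain XHR.
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    var payload = buildPayload(performance.getEntriesByType('resource'));
    navigator.sendBeacon('/beacon', JSON.stringify(payload));
  });
}
```

On the server side you would then bucket the payloads by user agent, IP range, or logged-in user to hunt for the patterns described above.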


Solution

I'm sorry I have only found your question now. You are looking for Yahoo's Boomerang. It does exactly what you describe: it measures page load time and beacons that data back to the server. It also includes a couple of plugins; the default one measures the user's available bandwidth to the server, which could be very useful in the use case you described.

I was delighted when I found boomerang, so I'm glad to be able to share the joy. :-)

I recommend starting here. I made the mistake of downloading the file from the downloads page, which does not contain the bandwidth plugin. If you want bandwidth measurement as well, you need to download the boomerang-0.9.1280532889.js file or build your own using the makefile.
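Once the script is on the page, setup is a one-line configuration call; `BOOMR.init` and its `beacon_url` option are from Boomerang's documented API, while the beacon path here is an assumption for your own server:

```javascript
// Configuration fragment: point Boomerang at your beacon endpoint.
// Load boomerang-0.9.1280532889.js on the page first.
BOOMR.init({
  beacon_url: 'http://yoursite.example/beacon.gif'
});
```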

The data is beaconed back as a GET request. I currently point that request at a 35-byte GIF file so I can extract the data from the server logs. This is the Boomerang author's recommendation at scale, but the beacon URL could equally be a PHP or other script that processes the data.
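Extracting the data from the logs afterwards is a small offline job. Below is a hedged Node sketch: it assumes a common-log-format access log and Boomerang-style query parameters (`t_done`, Boomerang's total page load time in ms, and `u`, the page URL); the `/beacon.gif` path matches the setup described above.

```javascript
// Sketch: pull beacon parameters back out of an access-log line offline.
// Assumes common log format and Boomerang-style parameter names
// (t_done = total page load time in ms, u = page URL).

function parseBeaconLine(line) {
  // Match the query string of a GET request to the beacon GIF.
  var m = line.match(/"GET \/beacon\.gif\?([^ "]+)/);
  if (!m) return null;
  var params = {};
  m[1].split('&').forEach(function (pair) {
    var kv = pair.split('=');
    params[decodeURIComponent(kv[0])] = decodeURIComponent(kv[1] || '');
  });
  return params;
}

var sample = '203.0.113.7 - - [10/Oct/2023:13:55:36 +0000] ' +
  '"GET /beacon.gif?u=http%3A%2F%2Fexample.com%2F&t_done=4123 HTTP/1.1" 200 35';
console.log(parseBeaconLine(sample));
```

Running this over the full log and grouping by user agent or client IP gives exactly the anomaly-hunting dataset the question asks for.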

OTHER TIPS

I don't know of such a library, but I suppose it wouldn't be that much work to build one yourself.

By writing a wrapper that is called for every request, you can time each request and store the result for later statistical analysis.
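A minimal sketch of that wrapper idea, assuming promise-returning request functions; `withTiming` and the `timings` store are illustrative names, not from any library:

```javascript
// Sketch: wrap any request-returning function so each call is timed
// and recorded for later reporting. Names here are illustrative only.

var timings = [];

function withTiming(label, fn) {
  return function () {
    var start = Date.now();
    return Promise.resolve(fn.apply(null, arguments)).then(function (result) {
      timings.push({ label: label, ms: Date.now() - start });
      return result;
    });
  };
}

// Usage idea: wrap fetch (or your XHR helper) once, call the wrapper everywhere:
// var timedFetch = withTiming('fetch', function (url) { return fetch(url); });
```

Periodically posting the `timings` array back to the server would give the per-request statistics the answer describes.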

For now, you can run Google Page Speed Online to further optimize your JS/HTML/CSS/images:

http://pagespeed.googlelabs.com/

There is. It is called analytics ;-)

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow