Question

I am trying to figure out the efficiency of my server side code.

Using microtime(true) to measure speed, I am able to calculate the time it took my script to run.
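A minimal sketch of that measurement technique, with `usleep()` standing in for the script's real work:

```php
<?php
// Minimal timing harness: wrap the work you want to measure between
// two microtime(true) calls (current time in seconds, as a float).
$start = microtime(true);

// ... the script's real work would go here; usleep() stands in for it.
usleep(50000); // 50 ms of simulated work

$elapsed = microtime(true) - $start;
// Log rather than echo in production, so timing output never reaches users.
printf("Execution time: %.4f s\n", $elapsed);
```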

I am getting average speeds of .3 to .5 seconds. These scripts do a number of database queries to return different values to the user.

What is considered an efficient execution time for PHP scripts that will be run online for a website?

I know it depends on exactly what is being done, but just consider this a standard script that reads from a database and returns values to the user. I look at Google and see them search the internet in .15 seconds and I feel like my script is crap.


Solution

YouTube's target page rendering time is < 100ms (Video here @7:00).

Your bottleneck is probably DB queries - try using

EXPLAIN select * from x...

to see if you can add indexes that will speed up your queries.
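For instance, assuming a hypothetical `users` table with an un-indexed `email` column (table and column names are invented for illustration), the MySQL workflow looks like this:

```sql
-- Ask MySQL how it plans to run the query. A "type" of ALL in the
-- output means a full table scan, usually a sign an index is missing.
EXPLAIN SELECT * FROM users WHERE email = 'a@example.com';

-- Add an index on the filtered column, then re-run the EXPLAIN:
-- "type" should change to ref and the "rows" estimate should drop sharply.
CREATE INDEX idx_users_email ON users (email);
```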

Edit: the link above has died. High Scalability did a feature on YouTube that used that video as its primary source, so it may be of some interest: http://highscalability.com/youtube-architecture

OTHER TIPS

Seems a bit high to me.

For reference, a framework I've created gets execution times as low as 0.0028 s and as high as 0.0340 s. On average, each page runs between 11 and 18 SQL queries.

However, keep in mind that this is a highly optimized framework, taking advantage of caching, very careful coding of queries, and autoloading. Try implementing those tactics, and you should see a big improvement.
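Of the tactics mentioned, caching is the easiest to illustrate. A minimal sketch, using a static in-memory array as a stand-in for a real cache backend (APCu, Memcached, Redis) — the expensive callback runs only on a cache miss:

```php
<?php
// Sketch of query-result caching. A static array stands in for a real
// cache backend; the $compute callback runs only on a cache miss.
function cached(string $key, int $ttl, callable $compute)
{
    static $cache = [];
    $now = time();
    if (isset($cache[$key]) && $cache[$key]['expires'] > $now) {
        return $cache[$key]['value']; // cache hit: skip the expensive work
    }
    $value = $compute();
    $cache[$key] = ['value' => $value, 'expires' => $now + $ttl];
    return $value;
}

// Usage: the closure (a stand-in for a slow SQL query) runs only once;
// the second call is served from the cache, so its closure never runs.
$a = cached('user_count', 60, function () { return 42; });
$b = cached('user_count', 60, function () { return 99; });
printf("%d %d\n", $a, $b);
```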

With PHP, most pages are generated so fast that the biggest delay is the page rendering, including all sub-requests such as images. The visitor's connection speed and quality, their computer, and their software are also important factors.

To be generous, let's say PHP takes 20% of the total load and rendering time of a web page. This percentage is very approximate, but it serves for an illustrative example.

An average page loading time is around 3 seconds (which is too much). A good-quality website should take about 1 second to be fully loaded, so PHP would be allowed 200 ms (20% of 1 s) to generate the output; by the same logic, PHP could take up to 600 ms on an "average" website.
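That arithmetic can be sketched directly (the 20% share is the author's illustrative assumption, not a measured figure):

```php
<?php
// Back-of-the-envelope PHP time budget, using the rough 20% share above.
$phpShare        = 0.20; // assumed fraction of total load time spent in PHP
$goodPageLoad    = 1.0;  // seconds, a "good quality" site
$averagePageLoad = 3.0;  // seconds, the quoted average

$goodBudgetMs    = $phpShare * $goodPageLoad * 1000;
$averageBudgetMs = $phpShare * $averagePageLoad * 1000;

printf("PHP budget (good site): %.0f ms\n", $goodBudgetMs);    // 200 ms
printf("PHP budget (average site): %.0f ms\n", $averageBudgetMs); // 600 ms
```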

Note: the PHP execution time can be improved by changing your hosting provider, or by improving your source code.

Hmm... I'm not sure an absolute value is quite fair here. It really depends on the hardware. When I develop locally, my developer machine runs something like 5-10 times slower than the actual server. So if we take an absolute value, the "acceptable" range would vary depending on the hardware.

Well, generally I try to keep things below 100 ms. If the server load time is higher, I'll trace the execution and try to figure out what's wrong. I have to say that most of the time, the database (hence the queries) is the bottleneck, and serious work there is really important.

Does it need to be faster and why?

If the answer is "Yes, because it's on the requirements list" or "Because it takes valuable server resources", then try to optimize your SQL queries. Maybe you need to add index(es)...

Otherwise, I think you should move on to the next task. First, it works; second, you are talking about 0.3 to 0.5 s, which should be fast enough for both humans and machines.

This is of course very subjective, depends on the site, etc. etc.

However, I'd say that when a page starts taking longer than around 100 milliseconds, it's a noticeable delay for the user, and that might be "too long". That's if this is a page that can reasonably be expected to load instantaneously. If the page is a search page, doing a fulltext search in a large database, the situation is of course different.

I'd say ten times less would be OK. The number of queries doesn't matter, though: there can be 20 of them, each running for 0.005 s. Quality matters, not quantity. Profile your code by adding more microtime() calls, find the slowest part, and then optimize it.

If you have your own wrapper function for MySQL queries, that is a very handy place to put the microtime calls.
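A sketch of that wrapper idea: a single function that times every query and accumulates the results in a log you can dump at the end of the request. The fake `$fakeDb` closure below (with `usleep()` simulating MySQL latency) is a stand-in for a real database call:

```php
<?php
// One query wrapper that times every call. run_query() wraps a
// hypothetical DB call; timings accumulate in $queryLog for later review.
$queryLog = [];

function run_query(callable $query, string $sql, array &$log)
{
    $start = microtime(true);
    $result = $query($sql);            // the real DB call would happen here
    $log[] = ['sql' => $sql, 'seconds' => microtime(true) - $start];
    return $result;
}

// Usage with a fake query function (usleep stands in for MySQL latency):
$fakeDb = function (string $sql) { usleep(10000); return []; };
run_query($fakeDb, 'SELECT * FROM orders', $queryLog);
run_query($fakeDb, 'SELECT * FROM users', $queryLog);

// Dump the per-query timings at the end of the request.
foreach ($queryLog as $entry) {
    printf("%.4f s  %s\n", $entry['seconds'], $entry['sql']);
}
```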

Depends, as stated, but in addition consider this: with a one-second execution time, you will be able to serve (under ideal conditions) only one request per second on a server with one CPU and nothing else running on it. If more than one request per second comes in, a long queue builds up, the server runs flat out, and incoming requests take even longer to process. Even with fewer requests you still need to pay attention to CPU utilization: if the server is already heavily loaded, you may have a problem that needs attention.
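That capacity argument can be sketched numerically (the arrival rate below is a hypothetical figure for illustration):

```php
<?php
// Rough capacity check: with a fixed service time per request, a single
// worker's throughput is capped at 1 / serviceTime requests per second.
$serviceTime = 0.5;   // seconds of work per request (this thread's numbers)
$workers     = 1;     // e.g. CPU cores fully dedicated to PHP

$maxThroughput = $workers / $serviceTime;   // requests per second
printf("Max sustainable rate: %.1f req/s\n", $maxThroughput);

// If arrivals exceed this rate the queue grows without bound; keeping
// utilization below roughly 70-80% is a common rule of thumb for
// stable latency (queueing theory makes this precise).
$arrivalRate = 1.5;   // hypothetical incoming requests per second
printf("Utilization: %.0f%%\n", 100 * $arrivalRate / $maxThroughput);
```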

There are mathematical methods (queueing theory) that can be used to analyze capacity requirements, see for example PDQ (http://www.perfdynamics.com/Tools/PDQ.html) for more.

Comparing to Google may therefore not be fair, since they must handle massive numbers of incoming requests, and with 3 times the execution time they would need several times more servers than they already have...

Aim for < 200ms.

People increasingly start losing patience for things that take > 200ms.

It's all relative, really. Don't expect to get times on par with other sites using PHP. Remember, PHP needs to load everything from scratch on each page load.

Really you want to see how well your site does under load, like using Apache ab to test it. If your site can handle the highest traffic level you can expect, then you don't need to optimize it anymore. A user isn't going to be able to tell if your page loads in .75 seconds or .25 seconds.
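As a sketch, an `ab` run might look like this (the URL, request count, and concurrency level are placeholders to adjust for your site):

```shell
# 1000 requests total, 10 at a time; the URL is a placeholder for your page.
ab -n 1000 -c 10 "http://localhost/page.php"
# Key figures in the report: "Requests per second" and the percentile
# table ("95% of requests served within ... ms").
```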

Remember, calling microtime itself is going to add time to your page load since it has to make a call out to the operating system (context switch). It may be more valuable to optimize the page, making it smaller so it goes over the network faster and renders faster once on the client.

I notice that, as an editor on Wikipedia, we don't see complaints until page load initiation exceeds somewhere around 5 to 10 seconds. Of course the mechanism for reporting such slowness is obscure to most users.

For myself—as a user of travel websites—I am adequately placated by an intermediate screen which says "Got your request. It's now processing. It might take up to X seconds."

  • I wouldn't compare your script with Google unless you're maintaining a similar page-ranking infrastructure, etc.

  • If the search merely retrieves values from a database, speed may be improved somewhat by profiling the application and eliminating bottlenecks (to mention a few: scripts on the page, large images, large tables, missing database indexes).

I have made a WP site that runs a rather heavy search and aggregated-statistics procedure on one of its pages.

The respective PHP module fetches roughly 500 records from the DB and then decodes each one, querying the DB for further details across, say, 10 different custom fields to aggregate all the complementary data needed by the front-end.

Then it returns only a page of data (say 20 items) and the statistics info (another 10 items).

Looking at my Chrome dev tools timing

  • my local dev machine needs ~0.9 - 1.1 sec for TTFB for each XHR request which initiates that backend procedure described above.
  • the real web server on a pretty good shared hosting for the same page requires 0.3-0.6 sec for TTFB.

Obviously this is not a clean DB performance figure; Apache and the PHP<->MySQL layer get in the way of the timing, but it can serve as a ballpark. No SSL handshake or other connection overhead is included here, just the time the server took to prepare the data and respond; there is no page reload either, since it is an XHR request.

If the site in question has 100 visitors per hour hitting that same DB-heavy page at peak, that is only ~2 requests per minute; at 0.5 s per request, a single worker could serve about 120 requests per minute before becoming the bottleneck. Obviously the spare performance capacity is huge in this imaginary case.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow