Question

I've been doing some PHP work recently, and in all the code I've seen, people tend to use few methods. (They also tend to use few variables, but that's another issue.) I was wondering why this is, and I found this note "A function call with one parameter and an empty function body takes about the same time as doing 7-8 $localvar++ operations. A similar method call is of course about 15 $localvar++ operations" here.

Is this true, even when the PHP page has been compiled and cached? Should I avoid using methods as much as possible for efficiency? I like to write well-organized, human-readable code with methods wherever a code block would be repeated. If it is necessary to write flat code without methods, are there any programs that will "inline" method bodies? That way I could write nice code and then ugly it up before deployment.

By the way, the code I've been looking at is from the Joomla 1.5 core and several WordPress plugins, so I assume they are people who know what they're doing.

Note: I'm pleased that everyone has jumped on this question to talk about optimization in general, but in fact we're talking about optimization in interpreted languages. At least some hint of the fact that we're talking about PHP would be nice.

Solution

No offense intended, but I don't think Joomla and WordPress are the greatest examples of good PHP code. I have nothing personal against the people working on them; it's great how they enable people to have a website or blog, and I know a lot of people spend all their free time on those projects. But the code quality is rather poor.

Review the security announcements from the past year if you don't believe me. And if you're looking for performance from either of the two, their code doesn't excel there either. So it's by no means good code, but WordPress and Joomla both excel on the frontend: they're pretty easy to use, and people get a website and can do stuff.

And that's why they are so successful: people don't select them based on code quality but on what they enable them to do.

To answer your performance question: yes, it's true that all the good stuff (functions, classes, etc.) slows your application down. So if your application/script is all in one file, so be it; feel free to write bad PHP code then.

As soon as you expand and start to duplicate code, you should consider the trade-off (in speed) that writing maintainable code brings along. :-)

IMHO this trade off is rather small because of two things:

  1. CPU is cheap.
  2. Developers are not cheap.

When you need to go back into your code six months from now, ask yourself whether the nanoseconds saved at runtime still add up when you have to fix a nasty bug three or four times because of duplicated code.

You can do all sorts of things to make PHP run faster. People generally recommend a cache such as APC. APC is really awesome: it runs all sorts of optimizations in the background for you, e.g. caching the compiled bytecode of a PHP file, and it also provides you with userland functions to cache your own data.

So, for example, if you parse a configuration file each time you run a script, disk I/O really matters. With a simple apc_store() and apc_fetch() you can keep the parsed configuration in APC's memory-based (RAM) cache and retrieve it from there until the entry expires or is deleted.
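As a minimal sketch of that pattern (apc_store() and apc_fetch() are real APC functions; the file name, cache key, and five-minute TTL here are illustrative):

    <?php
    // Cache a parsed configuration file in APC's user cache.
    // 'config.ini' and the key 'app_config' are placeholder names.
    function get_config() {
        $config = apc_fetch('app_config', $success);
        if (!$success) {
            // Cache miss: hit the disk once, then keep the result
            // in shared memory for five minutes.
            $config = parse_ini_file('config.ini', true);
            apc_store('app_config', $config, 300);
        }
        return $config;
    }

Every request after the first skips the disk read entirely until the entry expires.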

APC is not the only cache, of course.

OTHER TIPS

How much "efficiency" do you need? Have you even measured? Premature optimization is the root of all evil, and optimization without measurement is ALWAYS premature.

Remember also the rules of Optimization Club.

  1. The first rule of Optimization Club is, you do not Optimize.
  2. The second rule of Optimization Club is, you do not Optimize without measuring.
  3. If your app is running faster than the underlying transport protocol, the optimization is over.
  4. One factor at a time.
  5. No marketroids, no marketroid schedules.
  6. Testing will go on as long as it has to.
  7. If this is your first night at Optimization Club, you have to write a test case.

You should see the responses to this question: Should a developer aim for readability or performance first?

To summarize the consensus: Unless you know for a fact (through testing/profiling) that your performance needs to be addressed in some specific area, readability is far more important.

In 99% of cases, you'd do better to worry about code understandability. Write code that is easy to test, understand, and maintain.

In those few cases where performance really is critical, scripting languages like PHP are not your best choice. There's a reason many base library functions in PHP are written in C, after all.

Personally, while there may be overhead for a function call, if it means I write the code once (parameterized), and then use it in 85 places, I'm WAY further ahead because I can fix it in one place.

Scripting languages tend to give people the idea that "good enough" and "works" are the only criteria to consider when coding.

Especially with a fast interpreter like PHP's, I don't think lack of readability/maintainability is EVER worth the efficiency you may (or may not!) gain from it.

And a note about WordPress: I've done a lot of browsing of the WordPress code. Don't assume those people know anything about good code, please.

To answer your first question: yes, it is true, and it remains true for compiled and cached opcode. And yes, you can make your code faster by avoiding function calls, except in extreme cases where your code grows too large because of code duplication.

You should do what you said you like: "write well-organized, human-readable code with methods wherever a code block would be repeated."

If you're going to commit the horrible atrocity of removing all function calls, at least use a profiler and do it only to the 10% of your code that matters.

An example of how micro-optimization leads to macro slowdowns:

If you're seriously considering manually inlining functions, consider manually unrolling loops.

JMPs are expensive, and if you can eliminate loops by unrolling and also eliminate all conditional blocks, you'll eliminate all that time wasted merely seeking around the CPU's cache.

Variable augmentation at runtime is slow too, as is pulling things out of a database, so you should inline all that data into your code as well.

Actually, loading up an interpreter merely to execute code and copy memory out to a user is exhaustively wasteful. Why don't we just pre-compute all the possible pages and store each one in memory, ready to go, so it's just a mem-copy? Surely that's fast!

Ah, but now we've got that slow thing called the Internet between us, hindering the user experience and limiting how much content we can use. How about we pre-compute the pages in advance, archive them all, and run them on the user's local machine? That'll be really fast!

But that's going to waste CPU cycles, lots of them, what with page load time and browser content rendering, so we'll skip the middleman and just deliver the pages on printed media! Genius!

/me watches your company collapse on its face while you spend 10 years precomputing (by hand) and printing pages nobody wants to see.

This may sound silly to you, but to the rest of us, what you proposed is just that ridiculous.

Optimisation is good, but draw the line somewhere sensible, so you don't have to worry about the future people who work on the code tracking you down in your sleep for leaving behind such a crappy, unmaintainable codebase.

Note: yes, I use Gentoo. How did you guess?

Of course you shouldn't write bad PHP code. But once you have something badly written, you can always use performance as an excuse. :-)

This is premature optimization. While it is true that a function call costs more than incrementing a local integer variable (nearly everything costs more), the cost of a function call is still very low compared to a database query.
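If you want to see the scale of the difference yourself, a throwaway micro-benchmark like the following makes the point quickly. This is a sketch; absolute numbers depend on your PHP version and hardware.

    <?php
    // Compare incrementing a local variable to calling an empty
    // function. Treat the output as an order-of-magnitude hint only.
    function noop($x) {}

    $n = 1000000;

    $start = microtime(true);
    for ($i = 0; $i < $n; $i++) {
        $local = 0;
        $local++;
    }
    $increments = microtime(true) - $start;

    $start = microtime(true);
    for ($i = 0; $i < $n; $i++) {
        noop(1);
    }
    $calls = microtime(true) - $start;

    printf("increments: %.4fs, function calls: %.4fs\n", $increments, $calls);

Whatever the ratio turns out to be on your machine, both totals will be dwarfed by a single database query.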

See also:

Wikipedia -> Optimization -> When to optimize

c2.com Wiki -> Premature Optimization

PHP's main strength is that it's quick and easy to get a working app. That strength comes from the opportunity to write loose (bad) code and have it still operate in a somewhat expected way.

If you are in a position to need to conserve a few CPU cycles, PHP is not what you should be using. When PHP web apps perform poorly, it is far more likely due to inefficient queries, not the speed of the code execution.

If you're that worried about every bit of efficiency, then why on earth are you using a scripting language? You should be programming in a much faster language (insert your favorite compiled language here), probably resulting in more, and less readable, code, but it'll run really fast, and you can still aim for best coding practices.

Seriously, if you're coding for running speed, you shouldn't be using PHP at all.

If you develop web applications with a MVC architectural pattern, you can greatly benefit from caching and serialization. You can cache views, or portions of it, and you can serialize models.

From experience, models often parse and generate most of the data being displayed. If you know a certain model won't generate new data frequently, like a model that parses an RSS feed, you can stash the parsed data somewhere and refresh it every once in a while.
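A rough sketch of that idea (the feed URL, cache path, and one-hour TTL are all illustrative, and serializing to a temp file is just one of several storage options):

    <?php
    // Serialize a model's parsed RSS data to a file so the feed is
    // only re-fetched and re-parsed when the cache is stale.
    function get_feed_items($url, $cache = '/tmp/feed.cache', $ttl = 3600) {
        if (file_exists($cache) && (time() - filemtime($cache)) < $ttl) {
            return unserialize(file_get_contents($cache));
        }
        $xml = simplexml_load_file($url); // fetch and parse (the slow part)
        $items = array();
        foreach ($xml->channel->item as $item) {
            $items[] = array(
                'title' => (string) $item->title,
                'link'  => (string) $item->link,
            );
        }
        file_put_contents($cache, serialize($items));
        return $items;
    }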

If you look at WordPress PHP code, it intermingles PHP tags with its HTML, which leads to spaghetti in my mind.

phpBB3, however, is way better in that regard. For example, it has a strict division between the PHP part and the styles part, which are XHTML-formatted files with {template} tags, parsed by a template engine. That is much cleaner.
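To illustrate the idea (this is not phpBB3's actual engine, just a minimal stand-in for the {TAG} substitution style):

    <?php
    // Replace {TAG} placeholders in markup with values supplied by
    // the PHP layer, keeping logic and presentation separate.
    function render_template($markup, array $vars) {
        foreach ($vars as $name => $value) {
            $markup = str_replace('{' . strtoupper($name) . '}', $value, $markup);
        }
        return $markup;
    }

    echo render_template(
        '<h1>{TITLE}</h1><p>{BODY}</p>',
        array('title' => 'Hello', 'body' => 'Markup lives outside the PHP logic.')
    );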

Write a couple of 10-minute examples and run them in your profiler.

That will tell you which is faster to the millisecond.

If you don't have a profiler, post them here, and I will run them in my PHPEd profiler.

I suspect that much of the time difference, if any, comes from having to open the file that a class is stored in, but that would have to be tested too.

Then ask yourself whether you care that much about a few milliseconds versus having to maintain spaghetti code. Will any of your users ever notice?

Edit

The profiler won't simulate high traffic volumes, but it will tell you which approach is faster for a single user, and which parts of the code use how much time, especially if you profile the operations being done repeatedly, say 1000 times each in a loop.

We can assume (though not always) that code that is faster for a single user will also be faster when used by a lot of people.

Those who lecture you about code micro-optimization are generally the same ones who have 50 SQL queries per page, taking a total of 2 seconds, because they have never heard of profiling. But their code is optimized!!! (and slow as hell)

Fact: adding another web server is not difficult. Replicating a database is. Optimizing web server code can be a net loss if it adds load on the DB.

Note: 2-3 ms for simple pages (like a forum topic), including SQL, is a good target for a PHP website. My old website used to hit that.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow