Question

I have an existing PHP website that is written in an old-fashioned way.

Every page follows the same pattern:

<?php
require_once("config.php");
//Some code
require_once("header.php");
?>
Some HTML and PHP code mixture
<?php
require_once("footer.php");
?>

All the DB connections, session data, and language files are initialized in the "config.php" file, and every DB access is done with a mysql_query call.

No OOP what-so-ever, purely procedural programming.

How would you optimize this code structure in order to improve performance and make this website robust enough to handle heavy traffic?


Solution

How would you optimize this code structure in order to improve performance and make this website robust enough to handle heavy traffic?

The structure you've shown us has very little scope for optimization. Making it object-oriented will make it slower than it currently is. Note that the code within the included files may benefit greatly from various changes.

There are only 3 lines of code here, so there's not a lot of scope for tuning. The _once constructs add a (tiny) overhead, as does the use of require rather than include - but this is a very small amount compared to what's likely to be happening in the code you've not shown us.

Where all the db connection, session data, language files are initiated at the "config.php" file

There are again marginal savings by delaying access to external resources until they are needed (and surrendering access immediately when they are no longer required) - but this is not an OO vs procedural issue.
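One way to sketch that "delay access until needed" idea in PHP is a lazy connection helper. This is illustrative only - the `db()` function name is made up, and an in-memory SQLite database stands in for MySQL so the snippet is self-contained:

```php
<?php
// Hypothetical sketch: connect lazily, only when a query is actually run.
// With MySQL you would swap in a 'mysql:host=...;dbname=...' DSN instead
// of the in-memory SQLite one used here for illustration.

function db(): PDO
{
    static $pdo = null;            // connection is created once, on first call
    if ($pdo === null) {
        $pdo = new PDO('sqlite::memory:');
        $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    }
    return $pdo;
}

// Pages that never touch the database never pay the connection cost:
$result = db()->query('SELECT 1 + 1')->fetchColumn();
echo $result, "\n"; // 2
```

Note this is a plain procedural function - as the answer says, the saving has nothing to do with OO vs. procedural style.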

Why will OO be slower?

Here the routing of requests is implemented via the webserver - webservers are really good at this and usually very, very efficient. The alternative approach of using a front controller gives some scope for applying templating in a more manageable way and for applying late patching of the content (although it's arguable whether this is a good idea). But using a front-controller pattern is not a requirement for OO.
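For reference, a front controller in its most minimal form looks something like the sketch below - every request hits one script, which dispatches by URL path. The routes and the `route()` helper are invented for the example:

```php
<?php
// Minimal front-controller sketch (illustrative only - the answer's point is
// that this pattern is optional, not a requirement, for OO code).
// All requests would be rewritten to this single script instead of one
// PHP file per page.

function route(string $path): string
{
    // Map of URL paths to handlers; a real site would render templates here.
    $routes = [
        '/'      => fn() => 'home page',
        '/about' => fn() => 'about page',
    ];
    $handler = $routes[$path] ?? fn() => '404 not found';
    return $handler();
}

echo route('/about'), "\n"; // about page
```

Compare this with the webserver doing the same dispatch for free: the pattern buys flexibility, not speed.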

Potentially, when written as OO code, redundant memory allocations hang around for longer - hence the runtime tends to have a larger memory footprint.

Overriding and decorating add additional abstraction and processing overhead to the invocation of data transformations.

every DB access is done with a mysql_query call

There are several parts to this point. Yes, the mysql_ extension is deprecated and you should be looking to migrate away from it as a priority (sorry, but I can't recommend any good forward/backward shims). However, it is mostly faster than the mysqlnd engine - the latter has a definite performance advantage with large datasets / high volume due to reduced memory load.
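A migration from `mysql_query` typically moves to PDO (or mysqli) with prepared statements. Below is a hedged sketch; an in-memory SQLite database is used so it runs standalone - with MySQL the DSN would be along the lines of `'mysql:host=localhost;dbname=app'`, and the table/column names here are invented:

```php
<?php
// Sketch of migrating a mysql_query() call to PDO with a prepared statement.
// SQLite in-memory keeps the snippet self-contained.

$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)');
$pdo->exec("INSERT INTO users (name) VALUES ('alice'), ('bob')");

// Old style (deprecated, and open to SQL injection):
//   $res = mysql_query("SELECT name FROM users WHERE id = " . $_GET['id']);

// New style - the placeholder keeps user input out of the SQL string:
$stmt = $pdo->prepare('SELECT name FROM users WHERE id = ?');
$stmt->execute([1]);
echo $stmt->fetchColumn(), "\n"; // alice
```

The prepared statement also closes the injection hole that string-built `mysql_query` calls usually carry, which matters as much as the deprecation itself.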

You seem to be convinced that there's something inherently wrong with procedural programming with regard to performance and scale. Nothing could be further from the truth. G-Wan, Apache httpd, Varnish, ATS, the Linux kernel are all written in C - not C++.

If you want to improve performance and scalability, then you're currently looking in the wrong place. The only way to make significant in-roads is to profile your code under relevant load levels.

OTHER TIPS

If you really don't want to change your structure (OOP and mysql_*..., and in fact even if you do change these), you should implement a cache system to avoid regenerating content every time. If you have data that doesn't change often (like blog posts, news, or member profiles), you can cache it for 5 minutes to lighten the SQL load. Google might help you with this.
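A minimal version of that idea is a file-based cache with a TTL. This is a sketch, not a production cache: the `cache_get`/`cache_set` helper names are made up, and a real deployment would add locking and use something like APCu, Memcached, or Redis:

```php
<?php
// Simple file-based cache sketch: store generated content for a few minutes
// instead of rebuilding it from SQL on every hit.

function cache_path(string $key): string
{
    return sys_get_temp_dir() . '/cache_' . md5($key);
}

function cache_get(string $key, int $ttl): ?string
{
    $file = cache_path($key);
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);
    }
    return null; // missing or expired
}

function cache_set(string $key, string $value): void
{
    file_put_contents(cache_path($key), $value);
}

// Usage: serve the cached copy if it is younger than 5 minutes (300 s).
$html = cache_get('homepage', 300);
if ($html === null) {
    $html = '<h1>expensive page</h1>'; // ...normally built from SQL queries
    cache_set('homepage', $html);
}
echo $html, "\n";
```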

There are also a lot of web techniques to optimize page loading: use a CDN to serve your static resources, a Varnish cache, etc.

With PHP itself, there are also some methods; plenty of blog posts exist about that. For example:

  • Avoid regex if possible
  • Initialize variables before use: it's about 10 times slower to increment/decrement an uninitialized variable (which is ugly anyway)
  • Don't call functions in the for-loop declaration; use a temporary variable instead
  • etc.
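The for-loop tip from the list above looks like this in practice (the array contents are arbitrary example data):

```php
<?php
// Illustration of the "don't call functions in the for declaration" tip:
// count() would otherwise be re-evaluated on every iteration.

$items = ['a', 'b', 'c', 'd'];

// Slower: count($items) runs once per loop iteration.
//   for ($i = 0; $i < count($items); $i++) { ... }

// Faster: hoist the call into a temporary variable.
$total = 0;
$n = count($items);
for ($i = 0; $i < $n; $i++) {
    $total++;
}
echo $total, "\n"; // 4
```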

Don't hesitate to benchmark: run some tests with JMeter to simulate a pool of connections and see which page is slow and what you should optimize first.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow