Question

Instead of starting new instances of a PHP script when an HTTP request is received, is there any way for one PHP script to handle multiple requests?


Solution

I haven't seen an implementation of that for HTTP requests; the closest I've managed is waiting for all the requests to come back. On the command line you could get there by forking the process and sending it to the background, or you could use Gearman (distributed work) for it.
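
If you go the Gearman route, a rough sketch looks like this (assuming the pecl/gearman extension and a gearmand server on localhost; the "lookup" function name and the work itself are only illustrative):

    <?php
    // worker.php - run from the CLI; one long-lived script instance
    // serves many jobs, so expensive setup happens only once.
    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);
    $worker->addFunction('lookup', function (GearmanJob $job) {
        return strrev($job->workload()); // placeholder for real work
    });
    while ($worker->work());

    <?php
    // Inside the web-facing script: hand the job to the worker.
    // doNormal() is the blocking call in recent versions of the extension.
    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730);
    echo $client->doNormal('lookup', 'some input');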

OTHER TIPS

PHP is built around the "shared nothing" concept, which lets you load-balance and scale an application across a distributed network. So no, this can't be done. If the initialization costs are high, consider adjusting the architecture so you cache your objects/data/views as much as you can, e.g. with serialize().
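
A rough sketch of that serialize()-based caching idea (buildExpensiveObject() and the five-minute lifetime are assumptions, not part of the original answer):

    <?php
    $cacheFile = sys_get_temp_dir() . '/app_object.cache';

    if (is_file($cacheFile) && time() - filemtime($cacheFile) < 300) {
        // Reuse the object a previous request already built.
        $object = unserialize(file_get_contents($cacheFile));
    } else {
        $object = buildExpensiveObject(); // hypothetical costly initialization
        file_put_contents($cacheFile, serialize($object), LOCK_EX);
    }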

If you make the file an HTTP server and run it as a process, then yes.

If it is run through Apache and mod_php, no.

(why on earth would you want that anyway?)
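For the "make the file an HTTP server" case, a minimal single-process sketch (run from the CLI; the port and the response are illustrative, and real request parsing is omitted):

    <?php
    $server = stream_socket_server('tcp://127.0.0.1:8080', $errno, $errstr);
    if ($server === false) {
        die("Could not bind: $errstr ($errno)\n");
    }
    $requests = 0; // state survives across requests in this one process
    while ($conn = stream_socket_accept($server, -1)) {
        fread($conn, 8192); // consume the raw request; headers ignored here
        $requests++;
        $body = "Handled $requests requests so far\n";
        fwrite($conn, "HTTP/1.1 200 OK\r\nContent-Length: " . strlen($body)
            . "\r\nConnection: close\r\n\r\n" . $body);
        fclose($conn);
    }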

As far as I know, there isn't a way to do that. The closest thing I can think of is using a PHP opcode cache like XCache or APC. Those cache the compiled code for faster script execution, but every request still gets its own instance of the script.

What you want is to cache data.

Your PHP script should simply check whether the cache already holds valid data for the request. If not, do your database read, update the cache, and return the results to the user.

I would suggest looking into various caching libraries and carefully considering how you will scale your cache. One place to start is Zend_Cache, possibly with memcached on the back-end.
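
A minimal read-through cache sketch with the Memcached extension (fetchFromDatabase() and the key scheme are hypothetical):

    <?php
    $cache = new Memcached();
    $cache->addServer('127.0.0.1', 11211);

    $id  = isset($_GET['id']) ? $_GET['id'] : '';
    $key = 'report:' . md5($id);

    $data = $cache->get($key);
    if ($data === false && $cache->getResultCode() === Memcached::RES_NOTFOUND) {
        $data = fetchFromDatabase($id);   // hypothetical database read
        $cache->set($key, $data, 300);    // keep it for five minutes
    }

    echo json_encode($data);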

The scripts that handle the HTTP requests can get the data from a small PHP daemon using sockets.

Here is a useful library for PHP daemons: http://github.com/kvz/system_daemon

And some documentation:

http://kevin.vanzonneveld.net/techblog/article/create_daemons_in_php/
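
A rough sketch of the daemon idea over a local Unix socket (the socket path and the payload are assumptions; the library above handles proper daemonizing):

    <?php
    // daemon.php - long-running CLI process serving precomputed data.
    @unlink('/tmp/data-daemon.sock');
    $sock = stream_socket_server('unix:///tmp/data-daemon.sock', $errno, $errstr);
    if ($sock === false) {
        die("Could not create socket: $errstr\n");
    }
    $data = json_encode(array('loaded_at' => time())); // built once, reused
    while ($conn = stream_socket_accept($sock, -1)) {
        fwrite($conn, $data);
        fclose($conn);
    }

    <?php
    // In the web-facing script: ask the daemon for the data.
    $conn = stream_socket_client('unix:///tmp/data-daemon.sock', $errno, $errstr);
    if ($conn !== false) {
        echo stream_get_contents($conn);
        fclose($conn);
    }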

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow