Question

FastCGI servers, for example, impose an execution time limit on PHP scripts that cannot be altered with set_time_limit() from within PHP. I believe IIS does this too.

I wrote an import script for a PHP application that works well under mod_php but fails under FastCGI (mod_fcgid) because the script is killed after a certain number of seconds. I don't yet know of a way to detect what the time limit is in this case, and I haven't decided how I'm going to get around it. Doing it in small chunks with redirects seems like one kludge, but how?

What techniques would you use when coding a long-running task such as an import or export task, where an individual PHP script may be terminated by the server after a certain number of seconds?

Please assume you're creating a portable script, so you don't necessarily know whether PHP will eventually be run under mod_php, FastCGI, or IIS, or whether a maximum execution time is enforced at the server level. That probably also rules out shell scripts, etc.

Solution

Use the PHP command-line interface (CLI), which is not subject to the script time limits imposed by web servers. If you need to automate execution of your script, you can schedule it with cron.
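For instance, the long-running work can live in its own CLI script and be started on a schedule. A minimal sketch, in which the script path, the schedule, and the run_import() function are all hypothetical:

<?php
// import.php -- meant to be run from the command line, not through the web server.
// Under the CLI SAPI, max_execution_time defaults to 0 (no limit); making it
// explicit here guards against the script ever being invoked differently.
set_time_limit(0);

// run_import() stands in for whatever your actual import logic is.
run_import('/path/to/data.csv');

// Example crontab entry to run it nightly at 02:00:
// 0 2 * * * /usr/bin/php /var/www/app/import.php >> /var/log/import.log 2>&1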

OTHER TIPS

What you're really talking about is job queuing: the practice of running PHP code asynchronously from the front-end request. There are two primary ways of doing it in PHP. One is to use a program called Gearman; the other is to use the Zend Server Job Queue, which I am personally more familiar with. I have a blog post on how you can do it called Do you Queue, and I have found the implementation described there immensely easy to use.
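For the Gearman route, the general shape is roughly as follows. This is only a sketch: it assumes the PECL gearman extension and a gearmand daemon on its default port 4730, and the 'import' task name and payload are made up for illustration.

<?php
// worker.php -- a long-lived process, typically started from the command line,
// so it is not bound by the web server's execution time limit.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('import', function (GearmanJob $job) {
    $params = json_decode($job->workload(), true);
    // ... perform the slow import here ...
});
while ($worker->work());

And on the web-request side:

<?php
// Queue the job and return to the user immediately.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('import', json_encode(['file' => 'upload.csv']));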

What you might also want to try is to set max_execution_time to 0 prior to executing your logic.
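In code that would be something like the snippet below. Note that, as the question points out, this only lifts PHP's own limit; a timeout enforced by the FastCGI process manager or the web server itself is not affected.

<?php
// Remove PHP's own execution time limit before the heavy work begins.
// set_time_limit(0) is equivalent to setting max_execution_time to 0
// (unlimited) for the current request.
ini_set('max_execution_time', '0');
set_time_limit(0);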

Doing it in small chunks with redirects seems like one kludge, but how?

That's exactly how I handled a full forum database backup (phpBB) when the built-in export mechanism started hitting the max_execution_time limit.

I did it one table at a time, and for the big tables in chunks of 5000 rows. (It turned out that the limiting factor in the whole process wasn't the execution time on the export, but actually the file size that phpMyAdmin could handle on the import.)

After each chunk of exporting, I returned a page with a meta refresh tag in the header, redirecting the script back to itself with the next block's table number and start row in the query string.

<?php
// If there is still work left to do, emit a meta-refresh pointing the
// browser back at this same script, carrying the current table counter
// and the row to resume from in the query string.
if (!$all_done) {
    $new_url = $_SERVER['PHP_SELF'] . '?tablecount=' . $count;
    if (!$tabledone && '' != $start_row && null != $start_row) {
        // The current table isn't finished: resume from where we stopped.
        $new_url .= '&startrow=' . $start_row;
    } else {
        // The current table is done: start the next one from its first row.
        $new_url .= '&startrow=0';
    }
    echo '<meta http-equiv="refresh" content="0.5;url=' . $new_url . '" />';
}
?>

The counters were there so I could iterate through an array of table names that I'd retrieved with SHOW TABLES.
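Put together, the chunking side would have looked something along these lines. This is a reconstruction for illustration, not the original code: the mysqli handle, the bookkeeping variables, and the way the counters advance are all assumptions based on the description above.

<?php
// $db is an assumed mysqli connection; 5000 is the chunk size mentioned above.
$tables = [];
$result = $db->query('SHOW TABLES');
while ($row = $result->fetch_row()) {
    $tables[] = $row[0];
}

// Pick up where the previous request left off, via the query string.
$count     = isset($_GET['tablecount']) ? (int) $_GET['tablecount'] : 0;
$start_row = isset($_GET['startrow'])   ? (int) $_GET['startrow']   : 0;

$table = $tables[$count];
$rows  = $db->query("SELECT * FROM `$table` LIMIT $start_row, 5000");

// ... write this chunk out to the dump file ...

// Work out what the next request should do, then fall through to the
// meta-refresh snippet above to schedule it.
$tabledone = ($rows->num_rows < 5000);
if ($tabledone) {
    $count++;              // move on to the next table
    $start_row = 0;
} else {
    $start_row += 5000;    // continue this table from the next chunk
}
$all_done = ($count >= count($tables));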

Before I had the wits to cull the gigantic word-match table (which phpBB can rebuild by itself) from the export, this back-up script would take over half an hour to complete.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow