Question

so I have a php script that requires data from a remote server. It goes like this:

set_time_limit(100);
ignore_user_abort(true);

$x = 0;
while ($x < 10) {
    $data = file_get_contents("http://www.remoteserver.com/script");
    // do something with the data
    $x++;
}

// insert ALL the gained data in the local database

It takes about 10 seconds to complete if I open the script in my browser. However, I need to loop it not 10 times but 24 times, and the script then takes about 22 seconds to complete. I cannot insert the data within the loop; I need to wait until the loop has finished.

Now, the interesting part is that when the loop is set to 10, the script manages to finish when it is run via cron job. If it is set to 24, the script does not manage to finish and no data is inserted into the local database!

Why is this? Is there any solution to my problem? It works fine when I open it in my browser.

I call the script with this cronjob command:

php public_html/example.com/my_script.php

I get the same results with this cronjob command:

curl "http://example.com/my_script.php"

Solution

To run without timing out, use set_time_limit(0).
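A minimal sketch of the cron script with the limit removed (the remote URL is the one from the question, and insert_into_database() is a placeholder for your own insert logic, not an existing function):

<?php
set_time_limit(0);       // no execution time limit when run from cron
ignore_user_abort(true); // keep running even if the caller disconnects

$results = array();
for ($x = 0; $x < 24; $x++) {
    // fetch one batch of data from the remote server
    $results[] = file_get_contents("http://www.remoteserver.com/script");
}

// insert ALL the gained data into the local database only after the loop
insert_into_database($results); // placeholder for your own insert code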

To run without waiting for the results, run the script in the background and redirect its output: php -f /path/to/file > /dev/null 2>&1 &
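For instance, keeping the path from the question, the cron command could look like the following (the output redirection and the trailing & are illustrative additions, not taken from the question):

php -f public_html/example.com/my_script.php > /dev/null 2>&1 &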

OTHER TIPS

I had a similar problem (concurrency issues). You might have it too if your script takes longer to execute than the interval at which cron calls it. Try locking a file, something like:

$fp = fopen(__FILE__, 'r'); // lock the script file itself; a directory handle is not reliable for this
if ($fp && flock($fp, LOCK_EX | LOCK_NB)) // lock yourself; LOCK_NB returns immediately if another run holds the lock
{
    ...do your thing...
    flock($fp, LOCK_UN); // unlock yourself
}
if ($fp) fclose($fp);
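Because the operating system releases a flock() lock automatically when the process exits, a run that crashes will not leave the file locked for later cron runs.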
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow