I am using cloud hosting. The PHP script's runtime varies with the vertical scaling setting:

  • 10s: 128 MB RAM, 200 MHz
  • 1s : 2 GB RAM, 3 GHz

But I believe the execution time in both cases is less than 1s, because I call set_time_limit(1) and no timeout error is shown.

To be sure, I used getrusage() to calculate the execution time. In both cases it is less than 1s.

What explains this behavior? Why does the first setting take 10s to execute a 1s task without showing a timeout error?

<?php
set_time_limit(1);
$rustart = getrusage();
echo date("m/d/Y h:i:s a", time()).'<br>';

//Begin of the task
$a = 0;
for ($i=0; $i<6000000; $i++) {
  $a = $a + $i;
  if ($i % 1000000 == 0){
    echo "$i <br>";
  }
}
//End of the task

echo date("m/d/Y H:i:s a", time());
echo '<br>';

function rutime($ru, $rus, $index) {
    return ($ru["ru_$index.tv_sec"]*1000 + intval($ru["ru_$index.tv_usec"]/1000));
}
$ru = getrusage();
echo "This process used " . rutime($ru, $rustart, "utime") .
    " ms for its computations<br>";
echo "It spent " . rutime($ru, $rustart, "stime") .
    " ms in system calls<br>";
?>

Solution

It seems to me that you are miscalculating something in rutime(): the function accepts the starting usage $rus but never subtracts it, so it reports the process's total CPU usage rather than the interval you timed. Also note that getrusage() measures CPU time, not wall-clock time, so it will never show the seconds the process spends descheduled on a throttled CPU.
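For completeness, here is a sketch of a corrected rutime() that actually subtracts the starting sample (assuming the standard `ru_utime.tv_sec`/`ru_utime.tv_usec` keys that getrusage() returns on Linux):

```php
<?php
// Corrected rutime(): report the CPU time consumed *between* the two
// getrusage() samples, in milliseconds. The original version ignored
// $rus, so it returned the process's total usage instead of the delta.
function rutime(array $ru, array $rus, string $index): int {
    return ($ru["ru_$index.tv_sec"]  * 1000 + intval($ru["ru_$index.tv_usec"]  / 1000))
         - ($rus["ru_$index.tv_sec"] * 1000 + intval($rus["ru_$index.tv_usec"] / 1000));
}

$rustart = getrusage();

// Burn some user CPU time, as in the question's loop.
$a = 0;
for ($i = 0; $i < 6000000; $i++) {
    $a += $i;
}

$ru = getrusage();
echo "user: "   . rutime($ru, $rustart, "utime") . " ms\n";
echo "system: " . rutime($ru, $rustart, "stime") . " ms\n";
```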

Why not use something simpler to measure the wall-clock execution time, like microtime():

$start = microtime(true);

// your stuff here

$end = microtime(true);
$finish = $end - $start; // elapsed wall-clock seconds
echo $finish;
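To see the difference between the two clocks, here is a hypothetical demo that measures the same interval with both microtime() (wall clock) and getrusage() (CPU time). sleep() stands in for a descheduled process: wall time advances by a full second while almost no CPU time is accrued, which is essentially what happens on a throttled vCPU:

```php
<?php
// Measure one interval two ways: wall clock vs. CPU time.
$wallStart = microtime(true);
$cpuStart  = getrusage();

sleep(1); // no CPU work, just waiting -- like being descheduled

$wallMs = (int) round((microtime(true) - $wallStart) * 1000);
$ru     = getrusage();
$cpuMs  = ($ru["ru_utime.tv_sec"] - $cpuStart["ru_utime.tv_sec"]) * 1000
        + intval(($ru["ru_utime.tv_usec"] - $cpuStart["ru_utime.tv_usec"]) / 1000);

// Wall time is roughly 1000 ms; CPU time stays near zero, so a
// CPU-time-based measurement (or limit) never notices the wait.
echo "wall: $wallMs ms, cpu: $cpuMs ms\n";
```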