I've been playing around with a system I'm developing and managed to get it to cause this:
Fatal error: Maximum execution time of 30 seconds exceeded
It happened when I was doing something unrealistic, but it could nevertheless happen to a real user.
Does anyone know if there is a way to catch this exception? I've read around but everyone seems to suggest upping the time allowed.
How about trying what the PHP documentation (well... at least one of its commenters) suggests:
<?php
function shutdown()
{
    $a = error_get_last();
    if ($a == null) {
        echo "No errors";
    } else {
        print_r($a);
    }
}
register_shutdown_function('shutdown');
ini_set('max_execution_time', 1);
sleep(3);
?>
Have a look at the following links:
http://www.php.net/manual/en/function.set-error-handler.php#106061
http://www.php.net/manual/en/function.register-shutdown-function.php
Your only options are to increase the allowed execution time of the script (setting it to 0 makes it unlimited, but that is not recommended) or to spawn a separate process and hope for the best.
The reason that this isn't catchable is that it isn't really thrown. No single line of the code actually triggered the error; rather, PHP said, "Nope, sorry, this is too long. Time to shut down now." And that makes sense. Imagine a script with a max execution time of 30 seconds catching that error and taking another 30 seconds... in a poorly designed program, that opens up some rather nasty opportunities for exploitation. At a minimum, it would create opportunities for DoS attacks.
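If you do hand the heavy work off to a separate process, here is a minimal sketch of what that might look like. The Unix-like host, the CLI php binary being on the PATH, and the worker script name long_task.php are all illustrative assumptions, not part of the original answer:
<?php
// Sketch: offload the slow work to a background process so the web
// request itself finishes quickly. Assumes a Unix-like host with the
// CLI "php" binary available; long_task.php is a hypothetical worker.
$arg = escapeshellarg('some-job-id');
exec("php long_task.php {$arg} > /dev/null 2>&1 &");
echo "Job queued; poll for the result separately.";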
This isn't an exception, it's an error. There are important differences between exceptions and errors; first and foremost, errors can't be caught with try/catch semantics.
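To see the difference concretely, here is a minimal sketch (the busy loop is just a stand-in for any long-running work); the fatal timeout simply bypasses the catch block:
<?php
// Sketch: a fatal error is not a thrown exception, so try/catch cannot intercept it.
ini_set('max_execution_time', 1);
try {
    while (true) {
        // busy-wait until the engine enforces the time limit
    }
} catch (Exception $e) {
    echo 'never reached'; // the fatal error bypasses try/catch entirely
}
?>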
PHP scripts are built around a paradigm of short execution times, so PHP is configured by default to assume that if a script has been running for longer than 30 seconds it must be caught in an infinite loop and therefore should be terminated. This is to prevent an errant PHP script causing a denial of service, either by accident or by malicious intent.
However, scripts do sometimes need more running time than they are allocated by default.
You can try changing the maximum execution time, either by calling set_time_limit() or by raising the value of max_execution_time in the php.ini file. You can also remove the limit entirely by setting the execution time to 0, though this isn't recommended.
set_time_limit() may be disabled by mechanisms such as disable_functions, so it might not be available to you; likewise, you might not have access to php.ini. If both of these are the case, then you should contact your host for help.
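As a rough sketch of both approaches (the 300-second figure and the guard against disable_functions are illustrative assumptions, not requirements):
<?php
// Option 1: raise the limit at runtime, guarding against set_time_limit()
// being blocked via disable_functions.
$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));
if (!in_array('set_time_limit', $disabled, true)) {
    set_time_limit(300); // the timeout counter restarts from this call
}
// Option 2: in php.ini (requires access to the file):
//     max_execution_time = 300
?>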
One exception is PHP scripts run from the command line. Under these running conditions, PHP scripts may be interactive and need to spend a long time processing data or waiting for input. For this reason, there isn't a max_execution_time limit on scripts run from the command line by default.
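A quick, purely illustrative way to confirm this on your own machine:
<?php
// Sketch: under the CLI SAPI the default max_execution_time is 0 (no limit).
if (PHP_SAPI === 'cli') {
    var_dump(ini_get('max_execution_time')); // typically string(1) "0"
}
?>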
EDIT TO ADD: PHP 7's error handling had a major overhaul. I believe that errors and exceptions are now both subclasses of Throwable. This may make the above no longer relevant for PHP7+, though I'll have to look more closely into the specifics of how error handling works now to be sure.
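For what it's worth, here is a sketch of the PHP 7+ behavior in question. Even in PHP 7+, the execution-time limit is still enforced as a plain fatal error rather than a Throwable, so it still can't be caught; the example catches a different kind of engine error just to illustrate the new hierarchy (the function name is a hypothetical placeholder):
<?php
// Sketch (PHP 7+): many former fatal errors are now \Error objects and catchable.
try {
    some_undefined_function(); // hypothetical name; calling it throws \Error in PHP 7+
} catch (\Error $e) {
    echo 'Caught: ' . $e->getMessage() . "\n";
}
// "Maximum execution time exceeded", however, remains an uncatchable
// fatal error even in PHP 7+.
?>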
There is nothing you can do to catch it, but you can achieve a graceful shutdown using register_shutdown_function:
<?php
ini_set('display_errors', '0');
ini_set('max_execution_time', 15); // use this if you know your script should not take longer than 15 seconds to finish
register_shutdown_function('shutdown');

function shutdown()
{
    $error = error_get_last();
    if ($error !== null && $error['type'] === E_ERROR) {
        // do your shutdown stuff here
        // be careful: do not call any other function from within the shutdown function,
        // as PHP may not wait until that function finishes.
        // It's strange behavior. During testing I realized that if a function is called
        // from here, that function may or may not finish, and the code below that
        // function call may or may not get executed. Every time I had a different result.
        // e.g.
        other_function();
        // code below this function call may not get executed
    }
}

while (true) {
}

function other_function()
{
    // code in this function may not get executed if this function
    // is called from the shutdown function
}
?>
Will other_function() be called during shutdown if it was declared before the infinite loop?
Yeah, I tested the solution by TheJanOnline. sleep() does not count towards PHP execution time, so here is a WORKING version with an infinite loop:
<?php
function shutdown()
{
    $a = error_get_last();
    if ($a == null) {
        echo "No errors";
    } else {
        print_r($a);
    }
}
register_shutdown_function('shutdown');
ini_set('max_execution_time', 1);
while (1) { /* nothing */ }
// will die after 1 sec and print the error
?>
There is a slightly tricky way to handle "Fatal error: Maximum execution time of 30 seconds exceeded" as an exception in certain cases:
function time_sublimit($k = 0.8) {
    $limit = (int) ini_get('max_execution_time'); // reflects changes made via set_time_limit() as well
    $sub_limit = (int) round($limit * $k);
    if ($sub_limit === 0) { // a limit of 0 means "no limit", so never throw in that case
        $sub_limit = INF;
    }
    return $sub_limit;
}
In your code you must measure execution time and throw an exception before the timeout fatal error can be triggered. $k = 0.8 means 80% of the allowed execution time, so you have 20% of the time left to handle the exception.
try {
    $t1 = time(); // start measuring time
    while (true) { // put your long-running loop condition here
        $time_spent = time() - $t1;
        if ($time_spent >= time_sublimit()) {
            throw new Exception('Time sublimit reached');
        }
        // do work here
    }
} catch (Exception $e) {
    // handle the exception here
}
I came up with this based on the answer @pinkal-vansia gave, so I'm not claiming an original answer, just one with a practical application. I needed a way for the page to refresh itself in the event of a timeout. I have observed enough timeouts of my cURL script to know the code is working, but that sometimes, for whatever reason, it fails to connect to the remote server or to read the served HTML fully, and that upon refresh the problem goes away. So I am OK with the script refreshing itself to "cure" a maximum execution timeout error.
<?php // script name: scrape_script.php
ini_set('max_execution_time', 300);
register_shutdown_function('shutdown');

function shutdown()
{
    // only refresh when the script actually died on a fatal error;
    // otherwise a successful run would refresh (and re-run) itself forever
    $error = error_get_last();
    if ($error !== null && $error['type'] === E_ERROR) {
        ?><meta http-equiv="refresh" content="0; url=scrape_script.php"><?php
        // just do a meta refresh. Haven't tested with header location, but
        // this works fine.
    }
}
FYI, 300 seconds is not too long for the scraping script I'm running, which takes just a little less than that to extract the data from the kinds of pages I'm scraping. Sometimes it goes over by just a few seconds, only due to connection irregularities. Knowing that it's the connections that sometimes fail, rather than script processing, it's better not to increase the timeout, but rather just automatically refresh the page and try again.
I faced a similar problem and here was how I solved it:
<?php
function shutdown() {
    if (!is_null($error = error_get_last())) {
        if (strpos($error['message'], 'Maximum execution time') === false) {
            echo 'Other error: ' . print_r($error, true);
        } else {
            echo "Timeout!\n";
        }
    }
}

ini_set('display_errors', 0);
register_shutdown_function('shutdown');
set_time_limit(1);

echo "Starting...\n";
$i = 0;
while (++$i < 100000001) {
    if ($i % 100000 == 0) {
        echo ($i / 100000), "\n";
    }
}
echo "done.\n";
?>
This script, as is, is going to print Timeout! at the end. You can modify the line $i = 0; to $i = 1 / 0; and it is going to print:
Other error: Array
(
[type] => 2
[message] => Division by zero
[file] => /home/user/test.php
[line] => 17
)
References:
PHP: register_shutdown_function - Manual
PHP: set_time_limit - Manual
PHP: error_get_last - Manual