
How to catch the fatal error: Maximum execution time of 30 seconds exceeded in PHP

I've been playing around with a system I'm developing and managed to get it to cause this:

Fatal error: Maximum execution time of 30 seconds exceeded

It happened when I was doing something unrealistic, but nevertheless it could happen with a user.

Does anyone know if there is a way to catch this exception? I've read around but everyone seems to suggest upping the time allowed.

I believe that once the execution time is exceeded, the script is terminated. In that case, the script which might have caught the exception has been killed already.
Fatal errors cannot be caught (they are not exceptions) or handled. You can only handle script termination gracefully by registering a shutdown function, but the script will still end afterwards.
possible duplicate of How do I catch a PHP Fatal Error
No, the possible duplicate isn't an "exact" duplicate. It doesn't ask about the fatal error raised by a script timeout, only how to catch a Fatal Error in general. But since the general answer also covers this specific case, it qualifies as a possible duplicate. If the question were how to solve a script timeout error, then it would be a possible duplicate of most of these: stackoverflow.com/search?q=maximum+execution+time+php ;)
I can't answer this generally speaking, but in some situations you can limit the time on something to less than the max_execution_time, and then catch that. For example, curl_setopt($client, CURLOPT_TIMEOUT, 27); will cause curl to give up after 27 seconds, so that you don't trip up the fatal error.
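To sketch the approach from the comment above: cap the cURL request below PHP's `max_execution_time` so the slow call fails on its own terms instead of tripping the fatal error. The URL here is a placeholder, not from the original post.

```php
<?php
// Give up 3 seconds before PHP's default 30-second limit, so a slow
// remote server produces a recoverable cURL error instead of a fatal.
$client = curl_init('https://example.com/slow-endpoint'); // hypothetical URL
curl_setopt($client, CURLOPT_RETURNTRANSFER, true);
curl_setopt($client, CURLOPT_TIMEOUT, 27);

$body = curl_exec($client);
if ($body === false && curl_errno($client) === CURLE_OPERATION_TIMEDOUT) {
    // Recoverable: the request timed out, but the script is still alive.
    echo "Request timed out, can retry or warn the user\n";
}
curl_close($client);
```

The same pattern applies to anything with its own timeout knob (database queries, socket reads): set the inner timeout below PHP's outer one and handle the inner failure.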

John

How about trying what the PHP documentation (well... at least one of its commenters) suggests:

<?php
function shutdown()
{
    $a = error_get_last();

    if ($a == null) {
        echo "No errors";
    } else {
        print_r($a);
    }
}

register_shutdown_function('shutdown');
ini_set('max_execution_time', 1);
sleep(3); // caution: on many platforms (e.g. Linux) sleep() does not
          // count toward execution time, so use a busy loop to test
?>

Have a look at the following links:

http://www.php.net/manual/en/function.set-error-handler.php#106061
http://www.php.net/manual/en/function.register-shutdown-function.php


This might be the best-kept secret of PHP I've ever seen. Awesome.
I tested it, but the fatal error message is still displayed. Is there a way to avoid that? In any case, I used ini_set('display_errors', '0');
sleep() doesn't actually affect the time limit counter, which is weird, but true. So you should do something else to test your shutdown() function - a couple of nested for loops each iterating a million times should do it.
@BobbyJack While it's weird, it makes sense, since sleeping a thread stops execution for a period of time. Because it's not executing, that time is not counted as elapsed execution time.
Indeed, that's a good alternative. In my case, using this function I can redirect the user to another page/server with JavaScript as a last resort.
cwallenpoole

Your only options are to increase the script's allowed execution time (setting it to 0 makes it infinite, but that is not recommended) or to spawn a new thread and hope for the best.

The reason that this isn't catchable is that it isn't really thrown. No one line of code actually triggered the error; rather, PHP said, "Nope, sorry, this is too long. Time to shut down now." And that makes sense. Imagine a script with a max execution time of 30 seconds catching that error and then taking another 30 seconds... in a poorly designed program, that opens up some rather nasty opportunities for exploits. At a minimum, it creates opportunities for DoS attacks.


What about setting the execution time to 0 and then implementing my own timeout?
That could work in some cases but not all. It would probably be best if it were in some form of loop so you could test regularly.
Hmm. The reason I noticed it was when receiving packet chunks. I set the packet size to something really small and it took forever to send. My thinking is: if there were a lot of data to send, with the packet size at something like 1024 bytes, I could set my own timeout on that loop and react within so many seconds to warn the user.
I object to being nannied by PHP over whether a coding flaw allows infinite loops, DoS exploits, etc. I think it should be up to the programmer to handle the error. Maybe the timeout happens during a cURL call, and I want the page to refresh indefinitely until it reconnects to the URL (because the T1 line is down, or maybe the remote server is, who knows?). I don't want my pre-processor, whether that's PHP, ASP, CGI, etc., to make that decision for me. I don't think the pre-processor, or even a client-side language like JavaScript, should be making ethical or moral decisions on my behalf.
There are other options that you can take related to timeouts, but those aren't covered by this question. Most common practice is to simply make sure that a request doesn't take more than 60s. At that point you're not going to be public facing anyway.
GordonM

This isn't an exception, it's an error. There are important differences between exceptions and errors, first and foremost errors can't be caught with try/catch semantics.

PHP scripts are built around a paradigm of short execution times, so PHP is configured by default to assume that if a script has been running for longer than 30 seconds it must be caught in an infinite loop and therefore should be terminated. This is to prevent an errant PHP script causing a denial of service, either by accident or by malicious intent.

However, scripts do sometimes need more running time than they are allocated by default.

You can try raising the limit, either by calling set_time_limit() or by changing the value of max_execution_time in the php.ini file. You can also remove the limit entirely by setting the execution time to 0, though this isn't recommended.

set_time_limit() may be disabled by mechanisms such as disable_functions, so it might not be available to you; likewise, you might not have access to php.ini. If both apply, you should contact your host for help.

One exception is PHP scripts run from the command line. Under these running conditions, PHP scripts may be interactive and need to spend a long time processing data or waiting for input. For this reason there isn't a max_execution_time limit on scripts run from the command line by default.
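As a quick illustration of the two approaches above (the 300-second figure is arbitrary, chosen for the example):

```php
<?php
// Raise the limit for this script only; php.ini is untouched.
// Note that set_time_limit(n) also restarts the timer from zero
// each time it is called, which can be useful inside a long loop.
set_time_limit(300);

// Equivalent ini-based form:
ini_set('max_execution_time', '300');

echo ini_get('max_execution_time'), "\n"; // "300" if the host allows it
```

Either call silently fails (returns false) when the host has locked the setting down, so it's worth checking the return value in shared-hosting environments.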

EDIT TO ADD: PHP 7's error handling had a major overhaul. Both Error and Exception now implement the Throwable interface, so many formerly fatal conditions can be caught with try/catch. The execution-time limit, however, is still raised as a true fatal error, so the above still applies for PHP 7+.
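A minimal sketch of what the PHP 7 overhaul does and does not buy you here:

```php
<?php
// PHP 7+: many runtime failures are Error objects, catchable via Throwable.
try {
    intdiv(1, 0); // throws DivisionByZeroError (a subclass of Error)
} catch (\Throwable $t) {
    echo get_class($t), "\n"; // DivisionByZeroError
}
// The max_execution_time fatal, however, is still not thrown as a
// Throwable; register_shutdown_function() remains the only hook for it.
```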


pinkal vansia

There is nothing you can do to prevent the termination, but you can shut down gracefully using register_shutdown_function:

<?php

ini_set('display_errors', '0');
// Use this if you know your script should not take longer than 15 seconds:
ini_set('max_execution_time', 15);

register_shutdown_function('shutdown');

function shutdown()
{
    $error = error_get_last();

    if ($error !== null && $error['type'] === E_ERROR) {
        // Do your shutdown work here.
        // Be careful not to call other functions from within the shutdown
        // function, as PHP may not wait until they finish. It's strange
        // behavior: during testing I found that a function called from
        // here may or may not finish, and code below the call may or may
        // not get executed -- the result differed every time.

        // e.g.
        other_function();

        // Code below this call may not get executed.
    }
}

while (true) {
}

function other_function()
{
    // Code in this function may not get executed if the function
    // is called from the shutdown function.
}

?>

would other_function() be called during shutdown if it was declared before the infinite loop?
@Douglas.Sesar other_function() will be called either way, but there is no guarantee that it, or any code below the call, will finish. If other_function() takes very little time to do very little, it might finish along with the code below the call. But if it takes too long to accomplish its task, there is no guarantee it will finish entirely!!
In the strictest sense of the OP's question, this solution works. I want to acknowledge springboarding my slightly elaborated answer off of yours.
Walf

Yeah, I tested the solution by TheJanOnline. sleep() does not count toward PHP execution time, so here is a WORKING version with an indefinite loop:

<?php
function shutdown()
{
    $a = error_get_last();
    if ($a == null) {
        echo "No errors";
    } else {
        print_r($a);
    }
}

register_shutdown_function('shutdown');
ini_set('max_execution_time', 1);
while (1) {/* nothing */}
// will die after 1 sec and print the error
?>

user3901681

There is a slightly tricky way to handle "Fatal error: Maximum execution time of 30 seconds exceeded" as an exception in certain cases:

function time_sublimit($k = 0.8) {
    $limit = (int) ini_get('max_execution_time'); // updated even when you call set_time_limit()
    $sub_limit = (int) round($limit * $k);
    if ($sub_limit === 0) {
        $sub_limit = INF; // 0 means no limit is configured
    }
    return $sub_limit;
}

In your code you must measure execution time and throw an exception before the timeout fatal error can be triggered. $k = 0.8 means 80% of the allowed execution time, so you have 20% of the time left to handle the exception.

try {
    $t1 = time(); // start measuring time
    while (true) { // put your long-running loop condition here
        $time_spent = time() - $t1;
        if ($time_spent >= time_sublimit()) {
            throw new Exception('Time sublimit reached');
        }
        // do work here
    }
} catch (Exception $e) {
    // handle the exception here
}

This will work if you have work that is segmented nicely such that it can be iterated over, but if a single big operation or sequence of operations is hanging around until the time limit, you're still largely SOL.
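For the single-blocking-operation case the comment above describes, one workaround on the CLI under POSIX is an alarm signal whose handler throws a catchable exception before the hard limit. This is a sketch, not from the original answers, and it assumes the pcntl extension is available (CLI only, typically Linux/macOS):

```php
<?php
// Requires the pcntl extension (CLI + POSIX only).
// pcntl_async_signals(true) lets PHP 7.1+ deliver signals between
// opcodes without declare(ticks=1), so even a tight loop is interruptible.
pcntl_async_signals(true);

pcntl_signal(SIGALRM, function () {
    throw new RuntimeException('Soft time limit reached');
});

pcntl_alarm(1); // deliver SIGALRM in 1 second

try {
    while (true) {
        // simulated long-running work
    }
} catch (RuntimeException $e) {
    pcntl_alarm(0); // cancel any pending alarm
    echo $e->getMessage(), "\n";
}
```

Unlike the elapsed-time check above, this interrupts work that isn't segmented into loop iterations, though it cannot interrupt time spent inside a blocking C call (e.g. a socket read) until that call returns.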
TARKUS

I came up with this based on the answer @pinkal-vansia gave, so I'm not claiming an original answer, just one with a practical application. I needed a way for the page to refresh itself in the event of a timeout. I have observed enough timeouts of my cURL script to know the code is working, but sometimes, for whatever reason, it fails to connect to the remote server or to read the served HTML fully, and upon refresh the problem goes away. So I'm OK with the script refreshing itself to "cure" a maximum execution timeout error.

<?php //script name: scrape_script.php

ini_set('max_execution_time', 300);

register_shutdown_function('shutdown');

function shutdown() 
{ 
    ?><meta http-equiv="refresh" content="0; url=scrape_script.php"><?php
    // just do a meta refresh. Haven't tested with header location, but
    // this works fine.
}

FYI, 300 seconds is not too long for the scraping script I'm running, which takes just a little less than that to extract the data from the kinds of pages I'm scraping. Sometimes it runs over by only a few seconds, due to connection irregularities. Knowing that it's the connections that sometimes fail, rather than script processing, it's better not to increase the timeout, but simply to refresh the page automatically and try again.


Antônio Medeiros

I faced a similar problem and here was how I solved it:

<?php
function shutdown() {
    if (!is_null($error = error_get_last())) {
        if (strpos($error['message'], 'Maximum execution time') === false) {
            echo 'Other error: ' . print_r($error, true);
        } else {
            echo "Timeout!\n";
        }
    }
}

ini_set('display_errors', 0);
register_shutdown_function('shutdown');
set_time_limit(1);

echo "Starting...\n";
$i = 0;
while (++$i < 100000001) {
    if ($i % 100000 == 0) {
        echo ($i / 100000), "\n";
    }
}
echo "done.\n";
?>

This script, as is, will print Timeout! at the end.

You can change the line $i = 0; to $i = 1 / 0; and it will print (on PHP 7; in PHP 8, 1 / 0 throws a DivisionByZeroError instead of raising a warning):

Other error: Array
(
    [type] => 2
    [message] => Division by zero
    [file] => /home/user/test.php
    [line] => 17
)

References:

PHP: register_shutdown_function - Manual

PHP: set_time_limit - Manual

PHP: error_get_last - Manual