I have to process some large CSV files that generate a lot of SQL statements which then need to be executed. Naturally, trying to parse any of these files almost always ends with my script spitting back that horrible "Maximum execution time exceeded" error, even though I've pushed the max_execution_time setting in php.ini as high as I dare.

So how does one go about forcing a script to stay alive infinitely until it eventually finishes its job?

(Note: You really don’t want to apply what follows to code that contains an infinite loop!)

Well, PHP does hand us the nifty set_time_limit() function, which basically restarts PHP’s built-in timeout counter from zero and sets the new limit to the number of seconds specified in the call. So, for example, if the timeout is the default 30 seconds and you call set_time_limit(20) 25 seconds into script execution, the script will now be able to run for a total of 45 seconds before timing out.
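A minimal sketch of that arithmetic (the timings are purely hypothetical, not a complete script):

```php
<?php
// Assume max_execution_time is at its default of 30 seconds.

// ... imagine roughly 25 seconds of work happening here ...

// Restart the timeout counter: the 25 seconds already spent still count,
// but a fresh 20-second allowance begins now, so the script may run for
// 25 + 20 = 45 seconds in total before timing out.
set_time_limit(20);

// set_time_limit(0) would instead remove the limit altogether
// (see the note below about how reliably that works).
```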

Now, calling the function with a seconds parameter of zero is supposed to remove the time limit altogether, though in practice you may find that this doesn’t always work exactly as it should.

If, for example, your long-running script is built around one big loop, the easiest way to ensure it doesn’t time out is to call set_time_limit() with a modest timeout of, say, 20 seconds on each and every iteration of the loop.

This will in essence keep resetting the timeout counter and extending the maximum execution time, resulting in a script that has the potential to run just about forever! :)
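Here’s a rough sketch of what that looks like for the CSV scenario above; the file name, connection details, and table are just placeholders, not anything from a real project:

```php
<?php
// Hypothetical database connection and input file.
$pdo    = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$handle = fopen('data.csv', 'r');

while (($row = fgetcsv($handle)) !== false) {
    // Give the script another 20 seconds on every pass through the loop,
    // so the timeout counter never gets a chance to expire.
    set_time_limit(20);

    // Build and run whatever SQL statement this row calls for.
    $stmt = $pdo->prepare('INSERT INTO my_table (col_a, col_b) VALUES (?, ?)');
    $stmt->execute([$row[0], $row[1]]);
}

fclose($handle);
```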

[Unless, of course, you are running your script under IIS 7 on a Windows Server 2008 machine, in which case you’ll also have to adjust some additional Windows environment settings! Something to note, though, is that this function won’t work if you are running PHP in safe mode. Unfortunately there doesn’t seem to be a workaround for that case! :( ]

Related Link: http://php.net/manual/en/function.set-time-limit.php