[2010-07-05 16:13 UTC] wimroffel at planet dot nl
Description:
------------
For my site I generate a large number of PHP pages from a PHP script. Experience has taught me that PHP is not totally reliable and that some of the pages will contain errors. The errors may be small, but unlike HTML, PHP is unforgiving: a broken page immediately collapses into a single PHP error line.
So I made a script to check all my PHP files. I use the output buffering functions to direct their output to a file and then measure the size of this output. If it is extremely small, it must be a PHP error line.
After reading some 30 to 120 files, PHP just stops. For a given directory the stopping point is always the same. The total file size (all files together) processed before the stop varied between 500 and 3500 KB.
Test script:
---------------
function checkfile($myfile)
{ if (!file_exists($myfile)) return;
echo $myfile."<br>";
ob_start();
include($myfile);
$data = ob_get_contents();
ob_end_clean();
if(strlen($data) < 500) echo "<h2>".$myfile." is too small!</h2>";
}
$dp = opendir("./");
while( $dirfile = readdir($dp))
{ if(($dirfile == ".") || ($dirfile == "..")) // all files are php
continue;
checkfile($dirfile);
}
closedir($dp);
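Note that the loop condition while ($dirfile = readdir($dp)) stops early if a directory entry's name evaluates to false (for example a file named "0"); the PHP manual recommends an explicit comparison against false. A more defensive variant of the same loop (a sketch, assuming only .php files should be checked):
$dp = opendir("./");
if ($dp === false) die("cannot open directory");
while (($dirfile = readdir($dp)) !== false)
{ // skip . and .. and anything that is not a .php file
  if (($dirfile == ".") || ($dirfile == "..") || substr($dirfile, -4) != ".php")
    continue;
  checkfile($dirfile);
}
closedir($dp);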
Expected result:
----------------
I would expect the script to process all the files in the directory.
Actual result:
--------------
Instead it just processes a certain number of files and then stops. This is not a timeout: it happens within a second.
A better way to do what you are trying to do is to run each file through a separate PHP process, so a fatal error won't terminate your checking script. Like this:
foreach (glob("./*.php") as $f)
{ $out = null;
  exec("php -l $f", $out, $ret);
  if ($ret) echo "Problem in $f\n{$out[1]}";
}
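Note that php -l only reports parse errors; a page that parses cleanly but dies with a runtime fatal would still pass the lint check. To also isolate runtime fatals while keeping the original size test, each page could be executed in its own child process instead. A minimal sketch, assuming the php CLI binary is on the PATH, that each page can be run standalone, and reusing the 500-byte threshold from the original script:
foreach (glob("./*.php") as $f)
{ // run the page in a separate PHP process so a fatal error in it
  // cannot terminate this checking script
  $data = shell_exec("php " . escapeshellarg($f));
  if (strlen((string)$data) < 500) echo "<h2>".$f." is too small!</h2>";
}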