Bug #38950 script times out long before reaching max. execution time
Submitted: 2006-09-25 15:01 UTC Modified: 2006-09-26 15:57 UTC
Votes: 1
Avg. Score: 5.0 ± 0.0
Reproduced: 0 of 0 (0.0%)
From: pavel dot stratil-jun at fenix dot cz Assigned:
Status: Closed Package: *Encryption and hash functions
PHP Version: 5.1.6 OS: gentoo (amd64smp)
Private report: No CVE-ID: None

 
 [2006-09-25 15:01 UTC] pavel dot stratil-jun at fenix dot cz
Description:
------------
When trying to hash large files (I can't say yet how large, but an 800MB file was hashed without problems and a 1300MB file was not), the script times out after about 1 minute even though max. execution time is set to half an hour. Smaller files, such as the 800MB file, were successfully hashed a few times using different hashing methods (in total the script took about 4 minutes to run).

Reproduce code:
---------------
// tried both:

    $fp = fopen('file.ext', "r");
    $ctx = hash_init('sha512');
    while (!feof($fp)) {
        hash_update($ctx, fgets($fp, $bytes));
    }
    $res_hash = hash_final($ctx);  // this would be line 58
    fclose($fp);

// and

    hash_file('sha512', 'file.ext'); // this would be line 67

// $bytes was set to anywhere from 4 kB to 32 MB... same result every time

Expected result:
----------------
obviously, a hash.

Actual result:
--------------
Fatal error:  Maximum execution time of 1800 seconds exceeded in /opt/apache/htdocs/hashtest.php on line 58.

or

Fatal error:  Maximum execution time of 1800 seconds exceeded in /opt/apache/htdocs/hashtest.php on line 67.

depending on which piece of code was commented out.
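For reference, a minimal self-contained variant of the streaming loop above with wall-clock timing added; the 1MB chunk size is just one value from the 4kB-32MB range that was tried, and the file name is the same placeholder as above:

    // Sketch only: time the streaming hash and print the elapsed wall-clock
    // time next to the configured limit, to make the premature abort visible.
    $start = microtime(true);

    $fp  = fopen('file.ext', 'r');
    $ctx = hash_init('sha512');
    while (!feof($fp)) {
        hash_update($ctx, fgets($fp, 1048576)); // 1MB chunks
    }
    $hash = hash_final($ctx);
    fclose($fp);

    printf("hash: %s\nelapsed: %.1f s (max_execution_time: %s s)\n",
        $hash, microtime(true) - $start, ini_get('max_execution_time'));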


History

 [2006-09-26 09:13 UTC] tony2001@php.net
Please try using this CVS snapshot:

  http://snaps.php.net/php5.2-latest.tar.gz
 
For Windows:
 
  http://snaps.php.net/win32/php5.2-win32-latest.zip


 [2006-09-26 12:45 UTC] pavel dot stratil-jun at fenix dot cz
It seems that the problem is in

    $fp = fopen('test', "r");
    $ctx = hash_init('sha512');
    while (!feof($fp)) {
        hash_update($ctx, fgets($fp, 4096));
    }
    $uplo_mhash = hash_final($ctx);
    fclose($fp);

when calling the script from Apache. When running from the shell the problem disappears completely. Under Apache the premature timeout problem disappeared in the snapshot version, but streaming hashing still fails on large files (the script times out for real after max_execution_time even on relatively small files compared to the first tests, i.e. 200MB). hash_file() seems to work flawlessly. Going towards smaller files, streaming hashing catches up and runs at about 60% the speed of hash_file().
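A rough harness for the comparison described above (same placeholder file name 'test' as in the snippet; timings via microtime()), a sketch rather than anything from the original report:

    // Sketch: time hash_file() against the fgets() streaming loop on the
    // same file and check that both produce the same digest.
    $file = 'test';

    $t0 = microtime(true);
    $h1 = hash_file('sha512', $file);
    $t1 = microtime(true);

    $fp  = fopen($file, 'r');
    $ctx = hash_init('sha512');
    while (!feof($fp)) {
        hash_update($ctx, fgets($fp, 4096));
    }
    $h2 = hash_final($ctx);
    fclose($fp);
    $t2 = microtime(true);

    printf("hash_file(): %.2f s, streaming: %.2f s, hashes match: %s\n",
        $t1 - $t0, $t2 - $t1, $h1 === $h2 ? 'yes' : 'no');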

Tried this with Apache 2.2.2 and 2.2.3, with modified as well as distribution configurations, with the same result.


tested on 
php5.2-200609261030
./configure --prefix=${PHP_PATH} --with-apxs2=${APACHE_PATH}/bin/apxs
gmake      # ok
gmake test # failed in 4 tests

# Test for buffering in core functions with implicit flush off [tests/func/008.phpt]
# Bug #16069 [ext/iconv/tests/bug16069.phpt]
# iconv stream filter [ext/iconv/tests/iconv_stream_filter.phpt]
# Math constants [ext/standard/tests/math/constants.phpt]

PS: I don't know how far along the development of PHP 5.2 is, but I was getting many compile failures when trying to build with some common extensions such as imap (against imap2006) or mysqli (5.1.11).
 [2006-09-26 12:59 UTC] tony2001@php.net
>but streaming hashing still fails on large files (the 
>script timeouts for real after max_execution_time even on
> relatively small files, compared to the first tests - i.e.
>200MB)

Please elaborate.

>many compile failures when trying to build with some
> common extensions such as imap (against imap2006) or
> mysqli (5.1.11).

Please report them as separate issues.
 [2006-09-26 14:19 UTC] pavel dot stratil-jun at fenix dot cz
I can't find a pattern to the problem. I was always able to reproduce it on files > 1.2GB. I could sometimes reproduce it on files ranging from 70MB to 1.2GB, and I haven't been able to reproduce it on files < 70MB.
 [2006-09-26 14:24 UTC] tony2001@php.net
Reproduce what? What is the value of max_execution_time? What is the error message? What is the real amount of time spent and how did you measure it?
 [2006-09-26 15:14 UTC] pavel dot stratil-jun at fenix dot cz
* Reproduce what?
Reproduce the timeout error when trying streaming hashing.

* What is the value of max_execution_time?
1800 seconds

* What is the error message? 
on php 5.1.6 after running for about a minute (checked with a stopwatch): Fatal error:  Maximum execution time of 1800 seconds exceeded in
/opt/apache/htdocs/hashtest.php on line 58.

on php 5.2 after running for 1800 seconds (checked with a stopwatch): Fatal error:  Maximum execution time of 1800 seconds exceeded in
/opt/apache/htdocs/hashtest.php on line 58.


My totally naive guess is that during the while loop the EOF might not be properly recognised (on both versions), or that the loop itself has some counter which for some reason signals to PHP that the script timed out even if it didn't.
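If the EOF guess were relevant, a loop that checks the return value of fgets() directly (rather than relying on feof() alone) would rule it out; a sketch only, not part of the original report:

    $fp  = fopen('test', 'r');
    $ctx = hash_init('sha512');
    // Stop as soon as fgets() reports failure/EOF instead of waiting for
    // feof() to flip, so a stream that never reaches EOF can't spin forever.
    while (($chunk = fgets($fp, 4096)) !== false) {
        hash_update($ctx, $chunk);
    }
    $hash = hash_final($ctx);
    fclose($fp);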
 [2006-09-26 15:17 UTC] tony2001@php.net
>on php 5.2 after running for 1800 seconds (checked with a
> stopwatch): Fatal error:  Maximum execution time of 1800
> seconds exceeded 

So it does work fine, right?
 [2006-09-26 15:28 UTC] pavel dot stratil-jun at fenix dot cz
On 5.2? Well, yes and no. The premature timeout problem disappeared, but the timeout itself is still there.

hash_file() computes the checksum of a 200MB file on my machine in about 4 seconds,

while stream hashing times out after half an hour!

And I am not sure it's purely a performance issue, because on smaller files (anything from a few bytes to tens of MB) stream hashing runs at about 60% the speed of hash_file(). Extrapolating from that, I would expect a 200MB file to be hashed in, say, at most 10 seconds; a timeout after 30 minutes is not OK.
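(Rough arithmetic behind that estimate: 200MB in about 4 seconds with hash_file() is roughly 50MB/s; 60% of that is about 30MB/s, which works out to roughly 7 seconds for a 200MB file.)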
 [2006-09-26 15:57 UTC] tony2001@php.net
A 200MB file is hashed in 3 seconds using hash_file() and in 7 seconds using streams. The difference is expected because hash_file() uses a static buffer, while hash_update()/fgets() use dynamic buffers.
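Assuming the gap really comes from the per-call buffering, one way to narrow it is to let ext/hash read the stream itself via hash_update_stream() instead of feeding it fgets() chunks; a sketch, not something from this report:

    $fp  = fopen('test', 'r');
    $ctx = hash_init('sha512');
    // hash_update_stream() reads the open stream in fixed-size chunks
    // internally, avoiding a new string allocation on every fgets() call.
    hash_update_stream($ctx, $fp);
    $hash = hash_final($ctx);
    fclose($fp);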
 