Bug #16077 Large output data causes output buffering to crash
Submitted: 2002-03-14 12:46 UTC Modified: 2002-05-09 02:30 UTC
From: harry dot brueckner at orange-digital dot de Assigned:
Status: Not a bug Package: Output Control
PHP Version: 4.1.1 OS: RH Linux 2.4.9-31
Private report: No CVE-ID: None
 [2002-03-14 12:46 UTC] harry dot brueckner at orange-digital dot de
This script causes PHP to crash almost every time: it either segfaults, returns nothing, or makes wget report "HTTP request sent, awaiting response... End of file while parsing headers. Retrying." The include.txt file I used was larger than 100 KB and contained plain text. The content of the file is not important at all; I tried several versions.


<?php
function getMicrotime()
{
    list($usec, $sec) = explode(" ", microtime());

    return ((float)$usec + (float)$sec);
}

function timer($buffer)
{
    global $startTime;
    $endTime = getMicrotime();
    $diff = sprintf("%.5f", $endTime - $startTime);
    return $buffer . "\nExecution time: $diff sec<br>\n";
}

// timer() is registered as the output buffering callback
// (this is the ob_start() call the replies below refer to).
ob_start("timer");

$startTime = getMicrotime();

for ($i = 0; $i < 500; $i++) {
    $fh = fopen("include.txt", "r");
    $cmd = fread($fh, 1048576);
    fclose($fh);

    echo "$i $cmd\n";
}
?>

I compiled PHP with

./configure --prefix=/usr/local/php --disable-short-tags --enable-safe-mode --enable-ftp --with-mysql=/usr/local/mysql --with-zlib --enable-memory-limit


 [2002-05-03 23:25 UTC]
This looks like a chunked-output problem.
 [2002-05-09 02:30 UTC]
ob_start() disables chunked output by default. Your script should work if you call ob_start('timer', 4096), for example.

Unlike the output_buffering ini directive, ob_start()'s default chunk size is 0, which means no chunked output, so you are exhausting memory.
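A minimal sketch of the suggested fix, using a simplified stand-in callback instead of the reporter's timer() function: passing a chunk size as the second argument to ob_start() makes PHP run the callback and flush the buffer each time it fills, instead of accumulating all output in memory until the script ends.

```php
<?php
// Simplified stand-in for the reporter's timer() callback.
function passthrough($buffer)
{
    return $buffer;
}

// With a chunk size of 4096, the callback runs every time the
// buffer fills, so memory use stays bounded. Calling
// ob_start('passthrough') alone uses chunk size 0: the entire
// output is held until script end, which is what exhausted
// memory in the report above.
ob_start('passthrough', 4096);

for ($i = 0; $i < 500; $i++) {
    echo str_repeat("x", 100000), "\n";
}

ob_end_flush();
?>
```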

PHP Copyright © 2001-2023 The PHP Group
All rights reserved.
Last updated: Fri Mar 31 20:03:38 2023 UTC