Bug #18847 Output buffering problem
Submitted: 2002-08-10 17:54 UTC Modified: 2002-08-15 08:38 UTC
From: ntpt at centrum dot cz Assigned:
Status: Closed Package: Output Control
PHP Version: 4.1.2, 4.0.6, 4.2.2 OS: RH 7.3
Private report: No CVE-ID: None
 [2002-08-10 17:54 UTC] ntpt at centrum dot cz
It seems that output buffering cannot handle more than approx. 1.5 MB of output. If I do something like:

<?php
ob_start();

// ... generate some BIG output content, i.e. a big HTML page ...

$data = ob_get_contents();
?>

$data never gets more than approx. 1.5 MB of data. The rest of the buffer is cut off and lost.
It seems to be a more general problem in output buffering, because I tried to work around it inside the content-creation loop (using ob_get_length(), and when the buffer exceeded a certain size calling ob_get_contents(), ob_end_clean(), ob_start(), hoping that putting the buffer contents into a variable, cleaning the buffer, and starting again would overcome the limit), but that fails too. 4.0.6 shows the same behavior.
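A minimal, self-contained sketch of a reproduction, using synthetic output in place of the real page generator (the sizes here are illustrative):

<?php
ob_start();

// Emit roughly 3 MB of synthetic output; stands in for the real HTML page.
$line = str_repeat('x', 99) . "\n"; // 100 bytes per line
for ($i = 0; $i < 30000; $i++)
	{
	echo $line; // ~3,000,000 bytes in total
	}

$data = ob_get_contents();
ob_end_clean();

// If the reported truncation occurs, this prints far less than 3000000.
echo 'captured ' . strlen($data) . " bytes\n";
?>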



History

 [2002-08-10 18:01 UTC] kalowsky@php.net
Please be more specific about how it fails. Does it just end and nothing else happens? Have you tried increasing the max_execution_time value? Have you tried the PHP 4.2.2 release to see if this occurs there as well?
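For reference, a sketch of raising the execution limit at runtime rather than in php.ini (both calls exist in PHP 4; whether they help here is exactly what the question probes):

<?php
set_time_limit(180); // allow up to 180 s instead of the default 30
ini_set('max_execution_time', '180'); // equivalent via the ini setting
?>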


 [2002-08-10 18:24 UTC] ntpt at centrum dot cz
I have a script that generates content (an HTML page). When the whole page has been generated, I need to grab the script's output, do some processing on it, and then save it to a file and/or output it to the browser. If the generated page is small, everything works fine.

It fails because it fetches only a certain portion of the data. I.e., if a part of the script (which works fine without output buffering) produces an approx. 3 MB HTML page and I then try to fetch the contents of the output buffer into the variable $data, only approx. 1.5 MB is fetched into it.

No error messages, crashes, segfaults etc. Script execution is not even halted: the browser shows the results (as correct as they can be with the lost data...) of the further processing of $data done by the running script.
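A sketch of the capture-then-process pattern being described, with hypothetical generate_page() and post_process() helpers standing in for the real code:

<?php
function generate_page()
	{
	// Stands in for the real generator of the big HTML page.
	for ($i = 0; $i < 30000; $i++)
		{
		echo str_repeat('x', 99) . "\n";
		}
	}

function post_process($html)
	{
	return $html; // stands in for the real processing step
	}

ob_start();
generate_page();
$data = ob_get_contents(); // reportedly truncated at approx. 1.5 MB
ob_end_clean();

$out = post_process($data);

$fp = fopen('/tmp/page.html', 'w'); // save to a file...
fwrite($fp, $out);
fclose($fp);
echo $out; // ...and/or output it to the browser
?>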

I tried playing with php.ini, increasing the script execution time, increasing the memory usage limits etc., and tried adding some memory to my machine, with absolutely no effect in this case....

I have not tried 4.2.2; I am afraid I would not be able to successfully build it and integrate it with my distribution and with all the extensions I need to run on our production server...

PS: Sorry for my bad English...
 [2002-08-11 15:54 UTC] ntpt at centrum dot cz
I got and compiled the latest PHP 4.2.2 (it was not as horrible a job as I feared...) and tested it for this bug.

The latest PHP 4.2.2 suffers from this bug too... Its behavior is exactly the same as described for 4.0.6 and 4.1.2...
 [2002-08-11 23:21 UTC] yohgaki@php.net
You are most likely exhausting memory.
Get rid of the memory limit or set it to a large enough number.

The output buffer may consume a lot of memory, depending on how you use it.
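A sketch of the suggestion; note that in PHP 4 the memory_limit setting (and memory_get_usage()) only exists when PHP is compiled with --enable-memory-limit:

<?php
ini_set('memory_limit', '64M'); // php.ini equivalent: memory_limit = 64M

// If available, report current consumption while the buffer grows.
if (function_exists('memory_get_usage'))
	{
	echo memory_get_usage() . " bytes in use\n";
	}
?>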


 [2002-08-12 04:25 UTC] ntpt at centrum dot cz
I tried increasing the memory limit via php.ini from the original 8 MB to 16, 24, even 32 and 64 MB... and restarted the Apache server (PHP is compiled as an Apache module).

I tried increasing the script execution time to 120 and 180 s, but I think this was useless, because without output buffering the script runs fine and takes approx. 10-15 s.

My Linux box has 96 MB of memory + 256 MB swap. I added another 128 MB of memory and another 64 MB of swap to the configuration, and even cut off its Internet connectivity so nobody would use the server at the time of my tests.

These changes had absolutely no effect at all. Still the SAME size of output buffer is returned.
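A quick check that the restarted Apache module actually picked up the edited php.ini values (ini_get() is available in PHP 4):

<?php
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
?>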

Workaround for this problem that I tried:

$ob_threshold = 250000; // drain the buffer after approx. 250 kB of data
$buffer = '';

ob_start();
for ($i = 0; $i < $lines; $i++) // main content-generator loop of the script
	{
	$data = pg_fetch_array($result, $i);

	display_recort($data); // this creates an HTML table filled with data from the DB

	// workaround start
	if (ob_get_length() > $ob_threshold) // output buffer is bigger than $ob_threshold
		{
		$buffer = $buffer . ob_get_contents(); // move the buffered chunk aside
		ob_end_clean(); // discard the output buffer and stop buffering
		ob_start(); // start a new buffer
		}
	// workaround end
	}
$buffer = $buffer . ob_get_contents(); // collect what is left in the last buffer
ob_end_clean();

This fails too, fetching only the same amount of data (approx. 1.5 MB) into the variable $buffer.

The size of the fetched data is always the same; the changes I described above have no effect on the fetched size...
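A self-contained diagnostic variant of the workaround above, with synthetic rows in place of the pg_fetch_array()/display_recort() calls, logging the accumulated length on each drain; if the total stalls near 1.5 MB, the problem lies in the buffering layer rather than in the database code:

<?php
$ob_threshold = 250000;
$buffer = '';

ob_start();
for ($i = 0; $i < 40000; $i++)
	{
	echo str_pad("row $i", 79) . "\n"; // synthetic 80-byte row

	if (ob_get_length() > $ob_threshold)
		{
		$buffer .= ob_get_contents();
		ob_end_clean();
		ob_start();
		error_log('accumulated: ' . strlen($buffer) . ' bytes');
		}
	}
$buffer .= ob_get_contents(); // final drain
ob_end_clean();

error_log('final: ' . strlen($buffer) . ' bytes'); // expect ~3,200,000
?>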
 [2002-08-12 09:33 UTC] ntpt at centrum dot cz
It seems it is not a memory-limit problem. I just compiled PHP 4.2.2 with --disable-memory-limit, so it should not enforce any limit now, but the result is the same...
 [2002-08-15 08:38 UTC] ntpt at centrum dot cz
I have now found where the problem is. The bug described here is only a side effect. So I am setting the status here to "closed" and starting another bug report about it, with a more exact description of the problem.