[2002-08-10 17:54 UTC] ntpt at centrum dot cz
It seems that output buffering cannot handle more than approximately 1.5 MB of output. If I do something like:

<?
ob_start();
// ... generate some BIG output content, i.e. a big HTML page ...
$data = ob_get_contents();
?>

$data never gets more than approximately 1.5 MB of data; the rest of the buffer is cut off and lost. It seems to be a more general problem in output buffering, because I tried to work around it (using ob_get_length(), and when the buffer length exceeded some size, calling ob_get_contents(), ob_end_clean() and ob_start() inside the content creation loop, hoping that copying the buffer contents into a variable, cleaning the buffer and starting a new one would overcome this limit), but that fails too. PHP 4.0.6 shows the same behavior.
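A minimal self-contained reproduction along these lines (a sketch only, assuming the limit is in output buffering itself rather than in the database loop; the ~2 MB target size and the 100-byte line are arbitrary choices) would be:

<?php
// Generate roughly 2 MB of output inside a buffer, then check how much survives.
ob_start();
$line = str_repeat("x", 99) . "\n";   // 100 bytes per line
for ($i = 0; $i < 20000; $i++) {      // 20000 * 100 bytes = ~2 MB
    echo $line;
}
$expected = 20000 * strlen($line);
$data = ob_get_contents();
ob_end_clean();
echo "expected: $expected bytes, got: " . strlen($data) . " bytes\n";
?>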
I tried to increase memory_limit in php.ini from the original 8 MB to 16, 24, 32 and even 64 MB and restarted the Apache server (PHP is compiled as an Apache module). I also tried to increase the script execution time to 120 and 180 s, but I think this was pointless, because without output buffering the script runs fine and takes approximately 10-15 s.

My Linux box has 96 MB of memory + 256 MB of swap. I added another 128 MB of memory and another 64 MB of swap to the configuration, and even cut off the Internet connectivity so nobody else was using the server at the time of my tests. These changes had absolutely no effect at all; the SAME size of output buffer is still returned.

The workaround I tried:

$ob_treshold = 250000;                  // fill buffer with max ~256 kB of data
for ($i = 0; $i < $lines; $i++)         // main content generator loop of the script
{
    $data = pg_fetch_array($result, $i);
    display_recort($data);              // creates an HTML table filled with data from the DB

    // workaround start
    if (ob_get_length() > $ob_treshold) // output buffer is bigger than $ob_treshold
    {
        $buffer = $buffer . ob_get_contents();
        ob_end_clean();                 // discard output buffer and stop buffering
        ob_start();                     // start a new buffer
    }
    // workaround end
}

This fails too, fetching only the same amount of data (circa 1.5 MB) into the variable $buffer. The size of the fetched data is always the same; the changes described above have no effect on the fetched size.
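For comparison, a variant of the same workaround that writes each flushed chunk to a temporary file instead of concatenating into $buffer (a sketch only; the file name /tmp/ob_dump.html and the 250000-byte threshold are arbitrary, and display_recort(), $result and $lines are assumed to exist as in the code above) might look like:

<?php
$ob_treshold = 250000;
$fp = fopen("/tmp/ob_dump.html", "w");   // collect output on disk instead of in a PHP string

ob_start();
for ($i = 0; $i < $lines; $i++) {
    $data = pg_fetch_array($result, $i);
    display_recort($data);

    if (ob_get_length() > $ob_treshold) {
        fwrite($fp, ob_get_contents());  // flush the current buffer chunk to the file
        ob_end_clean();
        ob_start();
    }
}
fwrite($fp, ob_get_contents());          // append the final, partially filled chunk
ob_end_clean();
fclose($fp);

echo "total bytes written: " . filesize("/tmp/ob_dump.html") . "\n";
?>

Note that this variant also appends the last, partially filled buffer after the loop, which the workaround loop above never copies into $buffer.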