[2003-05-24 11:41 UTC] moriyoshi@php.net
I used the following script (text.php) to generate a text string:

    <?php
    $str = "";
    for ($i = 0; $i < 12950; $i++) {
        if ($i % 3 == 0) {
            $str .= "a";
        } elseif ($i % 3 == 1) {
            $str .= "b";
        } else {
            $str .= " ";
        }
    }
    echo $str;
    ?>

Then I read the file with either file() or file_get_contents(), like:

    <?php
    $url = "http://localhost/text.php";
    $result = file_get_contents($url);
    /*
    $res_arr = file($url);
    $result = join("", $res_arr);
    */
    printf("%s\n", $result);
    ?>

With small files everything works fine, but starting from a file size of about 12950 bytes the content returned by these functions gets crippled. The bigger the file, the worse it gets; large parts of the file are cut off.

My system:
OS: Windows XP Home SP1
Server: Apache 2.0.44
PHP version: 4.3.2RC4

The bug cannot be reproduced when I read a text file through the file system (C:\text.txt), or when using the php4ts.dll of PHP 4.3.1.
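As a possible diagnostic (a sketch, not part of the original report), the same URL can be read in fixed-size chunks with fopen()/fread(). If the chunked loop returns the full content while file_get_contents() truncates it, that points at the one-shot read path of the HTTP stream wrapper rather than at the wrapper's transport itself. The URL below is the reporter's test URL; the 8192-byte chunk size is an arbitrary choice.

    <?php
    // Assumed diagnostic: read the HTTP stream in chunks instead of one call.
    $url = "http://localhost/text.php";
    $fp = fopen($url, "rb");        // open via the http:// stream wrapper
    if ($fp === false) {
        die("could not open $url\n");
    }
    $result = "";
    while (!feof($fp)) {
        // fread() on a network stream may return fewer bytes than requested,
        // so accumulate until EOF rather than trusting a single read.
        $result .= fread($fp, 8192);
    }
    fclose($fp);
    printf("received %d bytes\n", strlen($result));
    ?>

Comparing strlen($result) here against strlen(file_get_contents($url)) for the same file size would narrow down where the bytes are lost.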