Description:
------------
When using file_get_contents() or fread() to read a file completely into
memory, PHP allocates twice the file size in memory. PHP 5.2.5 and 5.2.6 do
not have this problem, but 5.2.8 and 5.2.9 do. We work a lot with large
files, and this problem affects both memory usage (out of memory errors) and
performance (longer duration). Are there any alternatives to work around this
problem besides not reading the complete file into memory?

Reproduce code:
---------------
<?php
$filePath = tempnam(sys_get_temp_dir(), '');
$data     = str_pad('', 1024);

if (($fh = fopen($filePath, 'wb'))) {
    // Write a 4 MB file (4096 blocks of 1024 bytes).
    for ($i = 0; $i < 4096; $i++) {
        fputs($fh, $data);
    }
    fclose($fh);

    $filesize = filesize($filePath);
    print "filesize = " . number_format($filesize) . "\n";

    print "Memory usage at start\n";
    print "mem = " . number_format(memory_get_usage())
        . "; peak = " . number_format(memory_get_peak_usage()) . "\n";

    // Read the whole file into memory.
    $x = file_get_contents($filePath);

    print "\nAfter\n";
    print "mem = " . number_format(memory_get_usage())
        . "; peak = " . number_format(memory_get_peak_usage()) . "\n";

    unlink($filePath);
}

Expected result:
----------------
PHP allocates memory equal to the file size once.

filesize = 4,194,304
Memory usage at start
mem = 66,992; peak = 90,228

After
mem = 4,261,396; peak = 4,269,740

Actual result:
--------------
filesize = 4,194,304
Memory usage at start
mem = 71,300; peak = 96,044

After
mem = 4,265,704; peak = 8,460,160
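
If processing the data incrementally is an option, one way to sidestep the
doubled peak is to read the file in fixed-size chunks with fread() and handle
each chunk as it arrives, so only one chunk is resident at a time. This is a
minimal sketch of that approach, not a fix for the reported regression: the
8192-byte chunk size, the placeholder path, and the md5 hashing via
hash_init()/hash_update() are only illustrative stand-ins for whatever
per-chunk processing the application actually needs.

<?php
// Sketch: stream a file through per-chunk processing instead of loading it
// all at once; peak memory stays near the chunk size, not 2x the file size.
// The chunk size (8192) and the md5 hashing are illustrative assumptions.
$filePath = '/path/to/large/file';    // placeholder path

if (($fh = fopen($filePath, 'rb'))) {
    $ctx = hash_init('md5');          // stand-in for real per-chunk work
    while (!feof($fh)) {
        $chunk = fread($fh, 8192);    // read one chunk at a time
        if ($chunk === false || $chunk === '') {
            break;
        }
        hash_update($ctx, $chunk);    // process the chunk, then discard it
    }
    fclose($fh);
    print "md5 = " . hash_final($ctx) . "\n";
}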