[2007-06-16 16:48 UTC] mplomer at gmx dot de
Description:
------------
When using PHP arrays with a lot of entries (about 200000), I noticed the following problem: PHP sometimes does not free all used memory after completing the request. Under some circumstances, Apache uses 20-30 MB more RAM after the request. The problem is that this happens per child: if Apache runs with 64 threads, httpd.exe can persistently consume 300-600 MB of RAM, even without any active request.
Reproduce environment:
- I tested it in an Apache (1.3 and 2.2) environment under Windows (XP); I have not tested it under Linux yet
- PHP version used: 5.2.3. The problem was introduced with PHP 5.2; with PHP 5.1.6 it does not appear (but I noticed that PHP 5.1.6's memory management is much slower than the new one :-)
- Set ThreadsPerChild to 1 in httpd.conf to make sure you always hit the same PHP instance and to rule out interference between threads
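For reference, the single-thread setup described above corresponds to a one-line httpd.conf fragment like the following (a sketch; on Apache 2.x under Windows the directive belongs to the WinNT MPM):

```apache
# One worker thread per child, so every request hits the same PHP instance
ThreadsPerChild 1
```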
Reproduce procedure:
- Freshly start your 1-thread-Apache [It will consume about 10 MB]
- Execute the following script [Memory usage will grow to ~50-60 MB, and after execution memory usage shrinks back to ~10 MB again ... so far so good]
- Execute the script again 2 or more times [... and surprisingly Apache consumes about 35 MB after the request is complete!] (The number of executions needed to reproduce the problem depends on the elementCount of the test array and possibly some system-dependent factors; see reproduce code)
- If you execute the script some more times, sometimes the memory is freed and memory usage drops back to about 10 MB, but after some further requests the memory is NOT freed again.
Because of the last point, I think it is not a memory leak. I also compiled PHP as a debug version and there were no memory leaks reported (with report_memleaks = On).
But I still think the consumed memory should be completely freed after _each_ request. If this is a feature because something is cached, the cache needs an upper size limit; 20-30 MB per child is definitely too much.
If you put an "echo memory_get_peak_usage(true);" at the beginning of the script, you will see that PHP claims to use only about 200-300 KB at every script start. It does not account for the 20-30 MB it has consumed since the last execution.
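The measurement described above can be sketched as follows (a minimal instrumented version of the test script; the echoed labels are illustrative, not from the original report):

```php
<?php
// Print the peak memory usage at the very start of the request.
// If memory were carried over from the previous request, one would
// expect this value to reflect it - but it stays at a few hundred KB.
echo "peak at start: ", memory_get_peak_usage(true), " bytes\n";

$elementCount = 200000;
for ($i = 0; $i < $elementCount; $i++) {
    $variables[$i] = 'x';
}

// Peak usage after building the array, for comparison.
echo "peak at end:   ", memory_get_peak_usage(true), " bytes\n";
?>
```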
I tested this with an array, but the problem may, of course, lie deeper in the new memory management of PHP 5.2.
Reproduce code:
---------------
<?php
// elementCount: number of elements created in the test array:
// - a count of 200000 demonstrates the problem best on my machine
// - a count of 100000 or fewer reproduces the problem too, but you need more
//   calls of the script (browser refreshes) until the problem occurs
// - with a count of 400000 or more I could not reproduce the problem;
//   there it works fine
// I could not figure out exactly which factors this count depends on, but
// it _seems_ not to be machine-dependent; play around with it.
// Note also that the number of array elements matters, not the
// size of their keys or values.
$elementCount = 200000;
for ($i = 0; $i < $elementCount; $i++) {
    $variables[$i] = 'x';
}
// Even when you unset each array element manually, the problem occurs ...
//for ($i = 0; $i < $elementCount; $i++) {
//    unset($variables[$i]);
//}
// ... and/or if you unset the array itself ...
//unset($variables);
// ... but it does not occur when you put the unset directly inside the first
// for-loop, so that each element is unset immediately. Then only a small
// amount of memory is needed for the whole request - so PHP cannot forget
// to free much memory.
?>
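The unset-inside-the-loop variant described in the comments above, which reportedly does not trigger the problem, can be sketched as:

```php
<?php
// Create and unset each element within the same loop iteration, so the
// array never grows and the request's peak memory usage stays small.
// According to the report, this variant does not reproduce the problem.
$elementCount = 200000;
for ($i = 0; $i < $elementCount; $i++) {
    $variables[$i] = 'x';
    unset($variables[$i]); // freed immediately; the array stays empty
}
echo count($variables), "\n"; // prints 0
?>
```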
Expected result:
----------------
see "Description": all memory should be freed after execution of a PHP script.
Actual result:
--------------
see "Description": Not all memory is freed under some circumstances after script execution.
OK, it took me some time to get FastCGI running, but now it works, and yes ... I can reproduce it there, too. To be more detailed, I set up the following environment:
- Apache 2.0.59 (minimalistic configuration) with mod_fastcgi-SNAP-0404142202-AP2.dll
- FastCGI configured with 1 process: FastCgiServer ../php5/php-cgi.exe -processes 1
- PHP 5.2.3 (without php.ini; only php-cgi.exe and php5ts.dll)
- and the test script with 400,000 array elements:

<?php
$elementCount = 400000;
for ($i = 0; $i < $elementCount; $i++) {
    $variables[$i] = 'x';
}
?>

Reproduction, watching the Task Manager's memory usage after each execution of the script:
- Freshly start Apache; the child process "php-cgi.exe" is started [php-cgi.exe consumes ~4.1 MB]
- Execute the test script the first time: [php-cgi.exe consumes ~4.8 MB]
- Execute one more time: [php-cgi.exe consumes ~4.8 MB]
- Execute one more time: [php-cgi.exe consumes ~5.3 MB] ... might be OK so far
- Execute one more time: [php-cgi.exe consumes ~36.4 MB !!!] ... from now on this is not OK anymore
- Execute one more time: [php-cgi.exe consumes ~36.7 MB]
- Execute one more time: [php-cgi.exe consumes ~6.1 MB]
- Execute one more time: [php-cgi.exe consumes ~36.4 MB]
- Execute one more time: [php-cgi.exe consumes ~36.5 MB]
- Execute one more time: [php-cgi.exe consumes ~7.7 MB]
- Execute one more time: [php-cgi.exe consumes ~36.7 MB]
- Execute one more time: [php-cgi.exe consumes ~36.8 MB]
- Execute one more time: [php-cgi.exe consumes ~5.8 MB]
- Execute one more time: [php-cgi.exe consumes ~36.4 MB]
- Execute one more time: [php-cgi.exe consumes ~36.5 MB]
- Execute one more time: [php-cgi.exe consumes ~10.6 MB]
- Execute one more time: [php-cgi.exe consumes ~36.7 MB]
- Execute one more time: [php-cgi.exe consumes ~36.8 MB]
- Execute one more time: [php-cgi.exe consumes ~5.5 MB]
- ...

This is always reproducible, and the memory usage values are nearly identical on each reproduction. (They sometimes differ by only 4-12 KB.) But you can see there is a pattern: there are two executions after which the memory is not freed, and after the third execution the memory is mostly freed. Any ideas for further testing?