|
[2021-05-06 18:57 UTC] jon dot johnson at ucsf dot edu
Description:
------------
When working with a tar file using PharData, memory usage increases and is not released. I first noticed this when using PharData::extractTo(), but it is more easily reproduced with PharData::addEmptyDir() as I've done below. It seems to be related to the size of the archive, but I didn't confirm this.
Seems to exist in all versions of PHP, tested with:
docker container run --rm -v $(pwd):/test/ php:5-cli php /test/test.php
docker container run --rm -v $(pwd):/test/ php:7-cli php /test/test.php
docker container run --rm -v $(pwd):/test/ php:8-cli php /test/test.php
and got the same result.
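Since I first noticed the growth with extractTo(), here is a rough sketch of that variant as well (the archive name "example.tar" and the destination directory are placeholders; it assumes the tar file already exists):

<?php
echo 'Start: ' . memory_get_usage() . "\n\n";
$src = __DIR__ . DIRECTORY_SEPARATOR . 'example.tar'; // placeholder: any existing tar archive
$dest = __DIR__ . DIRECTORY_SEPARATOR . 'extracted';
for ($i = 0; $i < 10; $i++) {
    $phar = new PharData($src);
    $phar->extractTo($dest, null, true); // overwrite existing files on each pass
    unset($phar);
    echo "After ${i}: " . memory_get_usage() . "\n";
}
echo "\nEnd: " . memory_get_usage() . "\n";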
Test script:
---------------
<?php
echo 'Start: ' . memory_get_usage() . "\n\n";
for ($i = 0; $i < 10; $i++) {
    $path = __DIR__ . DIRECTORY_SEPARATOR . $i . '.tar';
    $phar = new PharData($path);
    $phar->addEmptyDir('test');
    unset($phar);
    unlink($path);
    echo "After ${i}: " . memory_get_usage() . "\n";
}
gc_collect_cycles();
echo "\nEnd: " . memory_get_usage() . "\n";
Expected result:
----------------
I would expect each iteration of the loop to be self-contained (even without the manual steps to unset and unlink), and that memory consumption would be constant for this script no matter how many iterations are run.
Actual result:
--------------
php test.php
Start: 398248
After 0: 411936
After 1: 424968
After 2: 438000
After 3: 451032
After 4: 464064
After 5: 477096
After 6: 490128
After 7: 503160
After 8: 516512
After 9: 529544
End: 529504
Changing the size of the tar file increases the amount of memory used. I would expect a map to grow by a constant amount per entry, not to scale with the file size; otherwise there should be a way to clear this map when working with multiple files. Adding an inner loop that puts more entries into each file in my example increases the memory consumed:

for ($j = 0; $j < 10; $j++) {
    $phar->addFromString("test-file-${j}.php", file_get_contents(__FILE__));
}
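Put together, the modified test script looks roughly like this (the inner count of 10 and the file names are just what I used; adjust as needed):

<?php
echo 'Start: ' . memory_get_usage() . "\n\n";
for ($i = 0; $i < 10; $i++) {
    $path = __DIR__ . DIRECTORY_SEPARATOR . $i . '.tar';
    $phar = new PharData($path);
    $phar->addEmptyDir('test');
    // More entries per archive means more memory retained per iteration.
    for ($j = 0; $j < 10; $j++) {
        $phar->addFromString("test-file-${j}.php", file_get_contents(__FILE__));
    }
    unset($phar);
    unlink($path);
    echo "After ${i}: " . memory_get_usage() . "\n";
}
gc_collect_cycles();
echo "\nEnd: " . memory_get_usage() . "\n";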