Bug #57132 memory leaks on php_zip
Submitted: 2006-07-11 04:45 UTC Modified: 2006-08-01 08:54 UTC
From: durmont at yahoo dot fr Assigned: pajoye (profile)
Status: No Feedback Package: zip (PECL)
PHP Version: 5.1.4 OS: windows 2003 server
Private report: No CVE-ID: None

 [2006-07-11 04:45 UTC] durmont at yahoo dot fr
Description:
------------
Actually it's with PHP 5.1.4, but that version is not in the list...
I use the php_zip.dll from the PECL 5.1.4 package found on the php.net download page.
The simple code snippets (1) and (2) below are used to open somewhat large archives (about 60 MB each, containing about 300 JPG photos).
These scripts are the only ones called on that server at the moment. After an hour and 200 users accessing the page, my Apache threads are eating up 1 GB of RAM, and this number never decreases.
I've added a cache system to my script so I don't extract the images from the zip on every call. It flattens the memory curve, but the curve is still increasing.

Reproduce code:
---------------
==== code 1 ====

// Count the entries (images) in the archive; $absolute_path is set by the caller.
$zip = zip_open($absolute_path);
while ($zip_entry = zip_read($zip))
{
	$this->nbImg++;
	zip_entry_close($zip_entry);
}
zip_close($zip);

==== code 2 ====

// Skip ahead to the entry at index $img, then send its data to the browser.
$zip = zip_open($absolute_path);
$zip_entry = zip_read($zip);
for ($i = 0; $i < $img; $i++)
{
	zip_entry_close($zip_entry);
	$zip_entry = zip_read($zip);
}
zip_entry_open($zip, $zip_entry);   // the entry must be opened before it can be read
$fic = zip_entry_read($zip_entry, zip_entry_filesize($zip_entry));
/* here I return $fic to the browser as an image file */
zip_entry_close($zip_entry);
zip_close($zip);

Expected result:
----------------
Everything runs fine, except that the memory usage goes wild!


History
 [2006-07-11 05:14 UTC] pierre dot php at gmail dot com
Memory going wild is not the same thing as a leak. A leak means that the memory is not freed at the end of a script.

Can you tell me if it is a leak or simply heavy memory usage?

The latter is normal if you use the old API. If you only want to extract the files, I recommend using the new OO API. If you want to extract and resize, check the stream example: you can open an image from a zip archive directly with GD (see odt.php for a zip:// example).

You have examples in the package itself (available at pecl.php.net/zip) or in CVS (http://cvs.php.net/viewvc.cgi/pecl/zip/examples/)
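For reference, a minimal sketch of what the OO API usage can look like (the archive and entry names below are placeholders, not taken from this report):

<?php
// Count/list the entries with the OO API instead of the procedural zip_* functions.
$zip = new ZipArchive();
if ($zip->open('photos.zip') === true) {          // 'photos.zip' is a placeholder
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $stat = $zip->statIndex($i);
        echo $stat['name'], ' (', $stat['size'], " bytes)\n";
    }
    // Read one entry straight into a string, without a temporary file.
    $fic = $zip->getFromName('img001.jpg');       // placeholder entry name
    $zip->close();
}

The zip:// stream example mentioned above boils down to opening an entry as 'zip:///path/to/archive.zip#entryname' with any stream-aware function.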
 [2006-07-11 07:24 UTC] durmont at yahoo dot fr
Sorry, English is not my native language.
I really mean I have memory leaks, because the memory used by the Apache processes is never freed, even when nobody accesses the server.
I'll check the zip:// stuff anyway; I hadn't seen that before :o)
 [2006-07-11 07:26 UTC] pierre dot php at gmail dot com
Hm, I did not notice any leaks in these functions. All data used by the zip functions is freed.

Just in case, you can email me privately (pierre dot php at gmail.com) if it's easier for you in French :)
 [2006-07-12 05:17 UTC] durmont at yahoo dot fr
So, what's next? How can we check whether it's a real memory leak (or not)?
BTW, I'll be away for 3 weeks, but you can send me the tests and I'll run them when I'm back.

And I'll continue in English if you don't mind, it's good practice for me! I'll just go look up the exact definition of the expression "going wild" in a good dictionary ;o)
 [2006-07-12 05:26 UTC] pierre dot php at gmail dot com
To check for leaks, you can run your code from the console, for example using the CLI, with a PHP version compiled with the debug option (--enable-debug) and the Zend memory manager enabled (it is by default).

If the memory is not freed, PHP will output errors on exit.

An alternative is to disable the Zend memory manager and run your scripts through Valgrind (valgrind php yourscript.php).

I use both methods before every zip release and did not notice any leaks :P
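As a rough complementary check that does not require a debug build (this is not the method described above, just a sketch, and 'photos.zip' is a placeholder path), one can watch memory_get_usage() across repeated runs of the suspect code from the CLI; numbers that keep growing between runs hint at memory that is never released:

<?php
// Crude leak probe: repeat the zip code many times and print the script's memory usage.
// Only leaks in memory tracked by PHP's allocator will show up this way.
$path = 'photos.zip';   // placeholder test archive

for ($run = 1; $run <= 10; $run++) {
    $zip = zip_open($path);
    while ($zip_entry = zip_read($zip)) {
        zip_entry_close($zip_entry);
    }
    zip_close($zip);

    echo 'run ', $run, ': ', memory_get_usage(), " bytes\n";
}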
 [2006-07-12 12:00 UTC] durmont at yahoo dot fr
Compile??? I'm on Windows here (sorry, it's not my fault, I swear). Both methods you describe imply recompiling PHP, but we don't own any Visual C++ (not counting the fact that my sysadmin would probably kill me if he caught me doing that on an in-production server).
 [2006-07-12 12:10 UTC] pierre dot php at gmail dot com
You can fetch debug builds from http://snaps.php.net
 [2006-07-12 12:11 UTC] pierre dot php at gmail dot com
Also, if you give me an example zip (please not 300MB :) and a script to reproduce your problem, I can take a look.
 [2006-07-12 13:16 UTC] durmont at yahoo dot fr
Right, I'll send you one of the zip files (it's 64 MB; I'll upload it somewhere and email you the address) and an extract of the scripts (in 3 weeks, as I said before).
One specific point in my case is that I have more than 200 people accessing a single big zip file in a short period. Maybe the concurrent accesses cause some trouble in the memory management?
 [2006-07-12 13:20 UTC] pierre dot php at gmail dot com
If you have 200 users accessing the *same* zip archive in the same span of time, I think you have a design problem.
 [2006-07-13 05:09 UTC] durmont at yahoo dot fr
Why? 200 people reading the same web page, or viewing the same image, at the same time is quite common. Why would it be strange to do the same thing with a resource enclosed in a zip archive?
The site is a small web publishing system (on an intranet). Users may publish an article with an optional attachment. If the attachment is a zip containing images, they are displayed as a photo gallery (with on-the-fly generated thumbnails). Okay, it's quite CPU-intensive, but speed optimization is not a concern here.
The file functions of PHP can open and read the same file multiple times simultaneously (yes, I did that too...), so why wouldn't it work for zip?
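The on-the-fly thumbnail generation described above could look roughly like this (archive path, entry name and thumbnail width are placeholders; it assumes GD and the zip:// wrapper are available):

<?php
// Build a thumbnail for one image stored inside a zip attachment.
$data = file_get_contents('zip:///data/attachments/gallery.zip#img001.jpg'); // placeholders
$src  = imagecreatefromstring($data);

$w  = imagesx($src);
$h  = imagesy($src);
$tw = 160;                           // arbitrary thumbnail width
$th = (int) round($h * $tw / $w);    // keep the aspect ratio

$thumb = imagecreatetruecolor($tw, $th);
imagecopyresampled($thumb, $src, 0, 0, 0, 0, $tw, $th, $w, $h);

header('Content-Type: image/jpeg');
imagejpeg($thumb);
imagedestroy($src);
imagedestroy($thumb);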
 [2006-07-13 05:27 UTC] pierre dot php at gmail dot com
It is normal to have 200 users reading the same page at the same time. It is slightly less normal to have 200 users accessing the same zip archive. You should extract and use its contents instead of accessing the zip archive endlessly.

However, regarding the possible leaks, they should not exist. Let me know once you have managed to reproduce them (with the PHP errors from a debug build), or when you have a small script and an example zip.
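A minimal sketch of that "extract once, serve from disk" idea (the cache location and naming scheme are assumptions, not something prescribed in this thread):

<?php
// Extract the archive into a per-archive cache directory on first request,
// then serve the images from disk on later requests.
function gallery_dir($zipPath)
{
    $dir = '/var/cache/galleries/' . md5($zipPath);   // placeholder cache location
    if (!is_dir($dir)) {
        mkdir($dir, 0775, true);
        $zip = new ZipArchive();
        if ($zip->open($zipPath) === true) {
            $zip->extractTo($dir);    // one-time extraction
            $zip->close();
        }
    }
    return $dir;
}

Two concurrent first requests could both find the directory missing and extract twice; a lock file, or extracting into a temporary directory and renaming it, would avoid that.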
 