Bug #32829 No garbage collection when iterating large loops
Submitted: 2005-04-26 02:07 UTC Modified: 2005-04-27 14:36 UTC
Votes: 1
Avg. Score: 5.0 ± 0.0
Reproduced: 1 of 1 (100.0%)
Same Version: 1 (100.0%)
Same OS: 1 (100.0%)
From: fransson at fransson dot de Assigned:
Status: Not a bug Package: Performance problem
PHP Version: 4.3.11 OS: Windows XP
Private report: No CVE-ID: None

 [2005-04-26 02:07 UTC] fransson at fransson dot de
Description:
------------
PHP runs out of memory when iterating over large loops. It seems that it does not free unused memory. The Windows Task Manager indicates that it tries to allocate more than 2 Gigabytes of memory (on my Pentium 4 with 1 Gigabyte of RAM) before crashing with an access violation.

In my opinion this is first and foremost a performance problem. The crash actually helps in this case, because the machine becomes usable again once the memory is released.

Although I suspect it is not the actual cause, this problem might be related to the MySQL interface, because the loops causing my trouble have the following structure:

while ($row = mysql_fetch_array($set, MYSQL_ASSOC)) {
  ...
}

My guess is that the data stored in the variable $row in each iteration is never freed by PHP, even though it can no longer be referenced once the next iteration has overwritten the old reference.

To be more explicit:
1. Before the execution of the while-loop, $row does not even exist and therefore does not consume any memory.
2. During the execution of the first iteration of the while-loop, $row consumes only enough memory to hold a single row of the result set (which might be around 20 KB in my case).
3. During the execution of the second iteration, $row is being overwritten with a new row of the result set. Its previous value is not accessible any more, but nevertheless consumes memory. Therefore, assuming a memory consumption of about 20 KB per row, the total memory consumption during the second iteration is 40 KB. 20 KB of these are completely wasted since they remain inaccessible to the application.
4. The larger the result set, the more memory is wasted. If it is large enough, it will slow down and eventually crash any machine. In my case, the result set holds about 50,000 records (a small test of this reasoning follows below).
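
A self-contained way to test the reasoning in step 3, independent of MySQL, is to repeatedly overwrite a variable of comparable size and watch PHP's own memory usage. This is only a minimal sketch; it assumes memory_get_usage() is available, which on PHP 4 requires a build with --enable-memory-limit (the reporter notes below that it is not available in the Windows build used here):

<?php
  // Minimal sketch: overwrite $row with a ~20 KB string each iteration,
  // roughly the per-row size estimated above, and report PHP's memory usage.
  for ($i = 0; $i < 1000; $i++) {
    $row = str_repeat("x", 20000);
    if (($i % 100) == 0) {
      print("Iteration " . $i . ": " . memory_get_usage() . " Bytes<br>");
    }
  }
  // If the old value of $row were never freed, usage would grow by roughly
  // 20 KB per iteration; if PHP releases it on overwrite, it stays flat.
?>

Note that memory allocated inside libmysql (for example for a buffered result set) is not counted by memory_get_usage(), so this figure can differ from what the Task Manager shows.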



 [2005-04-26 15:23 UTC] fransson at fransson dot de
The following script will kill almost any machine, as it consumes enormous amounts of memory (more than 1 Gigabyte). If your machine still survives, just append one or more zeros to the $mobySize value ;-)
This level of memory consumption should not occur: the script itself never holds more than one row and one short text string at a time. It seems there is a memory leak associated with PHP's MySQL interface.

<?php
  set_time_limit(3600);

  // Connect and select the test database.
  $lnk = mysql_connect("localhost", "root", "");
  if (!$lnk) { die("Cannot connect to database!"); }
  $ok = mysql_select_db("test", $lnk);
  if (!$ok) { die("Cannot select database!"); }

  $ok = mysql_query("DROP TABLE IF EXISTS moby");
  $ok = mysql_query("CREATE TABLE moby ("
    . "MobyID int(11) unsigned NOT NULL auto_increment,"
    . "MobyNumber int(11) unsigned NOT NULL default '0',"
    . "MobyText mediumtext,"
    . "PRIMARY KEY  (MobyID)"
    . ") ENGINE=MyISAM DEFAULT CHARSET=latin1");
  if (!$ok) { die("Cannot create table!"); }

  // Fill the table with $mobySize rows of random "Moby" text.
  $mobySize = 1000000;
  for ($i = 0; $i < $mobySize; $i++) {
    $text = "";
    for ($j = 0; $j < 100; $j++) { $text .= "Moby" . rand(); }
    $ok = mysql_query("INSERT INTO moby (MobyNumber, MobyText) VALUES (" . rand() . ", '" . $text . "')");
    if (!$ok) { die("Cannot insert record!"); }
  }

  $set = mysql_query("SELECT * FROM moby");
  $i = 0;
  while ($row = mysql_fetch_array($set, MYSQL_ASSOC)) {
    $value = $row["MobyNumber"];
    $i++;
    if (($i % 100000) == 0) {
      // print("Memory usage: " . number_format(memory_get_usage(), 0, ",", ".") . " Bytes<br>");
      print("Memory usage: memory_get_usage() is not defined on windows!<br>");
      print("MobyNumber: " . $value . "<br>");
      print("MobyText: " . $text . "<br>");
    }
    if ($value == $row) { print("Never."); } // never true; $row is an array
  }
?>
 [2005-04-27 14:36 UTC] georg@php.net
Thank you for taking the time to write to us, but this is not
a bug. Please double-check the documentation available at
http://www.php.net/manual/ and the instructions on how to report
a bug at http://bugs.php.net/how-to-report.php

If you have to retrieve a large number of rows, you should
use mysql_unbuffered_query(). Otherwise libmysql will
allocate memory for all retrieved rows.
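
For illustration, here is a minimal sketch of the fetch loop from the reproduction script above rewritten with mysql_unbuffered_query(); the connection settings and the moby table are the ones from that script:

<?php
  // Sketch only: same connection settings and table as the script above.
  $lnk = mysql_connect("localhost", "root", "");
  if (!$lnk) { die("Cannot connect to database!"); }
  if (!mysql_select_db("test", $lnk)) { die("Cannot select database!"); }

  // mysql_unbuffered_query() lets libmysql stream rows as they are fetched
  // instead of buffering the complete result set on the client first.
  $set = mysql_unbuffered_query("SELECT * FROM moby", $lnk);
  while ($row = mysql_fetch_array($set, MYSQL_ASSOC)) {
    // process one row at a time; only the current row is held in memory
  }
  mysql_free_result($set);
?>

Keep in mind that with an unbuffered query no other query can be sent over the same connection until all rows of the current result set have been fetched or the result has been freed.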
 