Bug #52902 Recursive References Leak Memory
Submitted: 2010-09-21 21:20 UTC Modified: 2010-09-21 22:12 UTC
From: jcampbell at remindermedia dot com Assigned:
Status: Not a bug Package: Class/Object related
PHP Version: 5.3.3 OS: Fedora 12
Private report: No CVE-ID: None
 [2010-09-21 21:20 UTC] jcampbell at remindermedia dot com
Description:
------------
The behavior originally reported in #33595 still exists, namely, "Objects with recursive references leak memory."

Test script:
---------------
class A {
    function __construct () {
        $this->b = new B($this);
    }
}

class B {
    function __construct ($parent = NULL) {
        $this->parent = $parent;
    }
}

echo memory_get_usage() . "\n";

for ($i = 0 ; $i < 1000000 ; $i++) {
    $a = new A();
}

echo memory_get_usage() . "\n";

Expected result:
----------------
Memory usage should remain relatively constant. For example, if you change one line to:
"$this->parent = clone $parent;"

Then the output is:
632216
632392

Actual result:
--------------
Memory usage increases with each newly instantiated object.

Actual output:
631976
1756792


History
 [2010-09-21 21:32 UTC] cataphract@php.net
-Status: Open +Status: Bogus
 [2010-09-21 21:32 UTC] cataphract@php.net
How is this leaking memory? Only <2MB for 2 million objects (A+B)? An object takes more than 1 byte...

If you make the loop instantiate 10 times more objects, you see memory usage remains constant, so the garbage collection mechanism is definitely working here.
 [2010-09-21 21:57 UTC] jcampbell at remindermedia dot com
You're correct that memory usage does cycle: it climbs and then drops back down when the collector runs. However, when using cloned objects, the memory usage is exactly the same after every iteration.

So if you make the objects large enough that the accumulated uncollected instances exceed the memory limit (even though any single one fits), you'll run out of memory before the collector kicks in. Again, using cloned objects, you can loop indefinitely.

Is the solution to this to manually run gc_collect_cycles() when dealing with large objects? Because that workaround does seem to have the same effect on memory as cloning.
 [2010-09-21 22:12 UTC] cataphract@php.net
Your alternatives are

* recompile PHP with a smaller value of GC_ROOT_BUFFER_MAX_ENTRIES (by default, it's relatively high: 10 000)
* call gc_collect_cycles()
* increase the memory limit
* avoid circular references
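The gc_collect_cycles() workaround discussed above can be sketched as follows. This is a minimal illustration, not part of the original report: the class definitions are taken from the reporter's test script, and the collection interval of 1000 iterations is an arbitrary choice for the example.

```php
<?php
// Reporter's classes: each A holds a B, and that B points back at its A,
// forming a reference cycle that plain refcounting cannot reclaim.
class A {
    public $b;
    function __construct() {
        $this->b = new B($this);
    }
}

class B {
    public $parent;
    function __construct($parent = null) {
        $this->parent = $parent;
    }
}

$before = memory_get_usage();

for ($i = 0; $i < 100000; $i++) {
    $a = new A();
    // Periodically force a cycle-collection pass so cyclic garbage is
    // reclaimed promptly, rather than waiting for the root buffer
    // (GC_ROOT_BUFFER_MAX_ENTRIES entries) to fill and trigger it.
    if ($i % 1000 === 0) {
        gc_collect_cycles();
    }
}
gc_collect_cycles();

$after = memory_get_usage();
$growth = $after - $before;
echo $growth . "\n";
```

With the explicit collection calls, the growth printed at the end stays small and bounded, comparable to the clone-based variant in the report; without them, memory rises until the collector's root buffer fills.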
 