Bug #25093 no way to free failed query => memory leak
Submitted: 2003-08-14 06:51 UTC Modified: 2003-08-15 07:05 UTC
From: php at pv2c dot sk Assigned:
Status: Closed Package: PostgreSQL related
PHP Version: 4.3.2 OS: Linux
Private report: No CVE-ID: None
 [2003-08-14 06:51 UTC] php at pv2c dot sk
pg_query() doesn't return a resource for failed queries. That's not very wise, IMHO (see related bug #18747), but the real problem is that you cannot free the results of failed queries.

It may not be noticeable if you have only a few failed queries, but it becomes a serious problem if you have lots. Try the example code.

Reproduce code:
// assume one table "aaa" with a single column "test" that is
// unique (a primary key, for example)

for ($t = 0; $t < 10000; $t++)
  $ret = pg_query($con, "INSERT INTO aaa (test) VALUES (1);");
  // $ret is FALSE (cannot insert duplicate value) => no way to free it

Expected result:
Some way to free the result resource...

Actual result:
PHP memory consumption grows *really fast*; in my case it even ignores the memory_limit setting in php.ini.
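Until the leak is fixed, one possible workaround (a sketch only, untested, assuming an open connection in $con and the same table "aaa" as above) is the asynchronous query API: pg_get_result() hands back a result resource even when the query failed, so pg_free_result() can be called on it:

```php
<?php
// Workaround sketch: send the query asynchronously and fetch the result
// explicitly. Unlike pg_query(), pg_get_result() returns a result resource
// even for a failed query, so it can be freed.
for ($t = 0; $t < 10000; $t++) {
    pg_send_query($con, "INSERT INTO aaa (test) VALUES (1);");
    $res = pg_get_result($con);            // resource even on error
    if ($res !== false) {
        if (pg_result_status($res) === PGSQL_FATAL_ERROR) {
            // inspect the failure if needed: pg_result_error($res)
        }
        pg_free_result($res);              // the memory can now be released
    }
}
?>
```

This trades the one-call convenience of pg_query() for explicit control over the result resource's lifetime.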


 [2003-08-14 06:53 UTC] php at pv2c dot sk
Sorry :), correct SQL in pg_query should be:
"INSERT INTO aaa (test) VALUES (1);"
 [2003-08-15 07:05 UTC]
This bug has been fixed in CVS.

In case this was a PHP problem, snapshots of the sources are packaged
every three hours; this change will be in the next snapshot. You can
grab the snapshot at
In case this was a documentation problem, the fix will show up soon at

In case this was a website problem, the change will show
up on the site and on the mirror sites in short time.
Thank you for the report, and for helping us make PHP better.
