Bug #74304 Can't "upgrade" from mysql_* functions due to memory allocation problems
Submitted: 2017-03-24 02:02 UTC Modified: 2020-06-14 04:22 UTC
Votes: 1
Avg. Score: 3.0 ± 0.0
Reproduced: 0 of 0 (0.0%)
From: josh at endries dot org Assigned: cmb (profile)
Status: No Feedback Package: *Database Functions
PHP Version: 5.6.30 OS: FreeBSD
Private report: No CVE-ID: None

 [2017-03-24 02:02 UTC] josh at endries dot org
Description:
------------
I am trying to update an old application away from the mysql_* functions, but I am running into memory problems that I really shouldn't be.

I've tried mysqli (OOP) and PDO, buffered and unbuffered queries, USE_RESULT, free()/close()/=null, and everything else I can think of or find online. I hope I missed something that will work. There is a text column in a join in some queries, the actual data in that field is ~30 KB (~13k chars for the longest row). The result sets (rows) the site is fetching are small. However, it does do a large number of queries (500+).

The current site's Apache processes (using mod_php) never go above 50 MB of memory usage when loading the test page. It inflates by 5-15 MB or so while loading, then drops back down again. This is with the mysql_* functions. With PDO or mysqli (with MYSQLI_USE_RESULT), it increases quickly to the memory_limit and dies. I had a 128 MB limit but raising that to 256 MB made no difference--it hit the limit in a few seconds.
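For reference, the buffered and unbuffered variants being compared above look roughly like this (a sketch with placeholder connection details and table name, not the reporter's actual code; it needs a live MySQL server to run):

```php
<?php
// Unbuffered fetch with mysqli: rows are streamed from the server
// instead of being copied into a client-side buffer first.
$mysqli = new mysqli('localhost', 'user', 'pass', 'db'); // placeholder credentials
$result = $mysqli->query('SELECT * FROM t', MYSQLI_USE_RESULT);
while ($row = $result->fetch_assoc()) {
    // process $row
}
$result->free();

// The PDO equivalent disables the driver-level buffer via an attribute.
$pdo = new PDO('mysql:host=localhost;dbname=db', 'user', 'pass');
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
$stmt = $pdo->query('SELECT * FROM t');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // process $row
}
$stmt->closeCursor();
```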

With a 768 MB limit, all three types transfer about 2 MB (inbound/download) over the wire, however with a 128 MB limit, for some reason PDO transferred a whopping 45 MB over the wire. No idea what happened there.

With a 768 MB limit, using PDO and watching vmstat, it uses 340 MB or more to load the test page in ~5.25sec.

With mysqli with MYSQLI_USE_RESULT, 380 MB used and a 4.75sec load time.
With mysqli without MYSQLI_USE_RESULT, it was almost the same.

With mysql_*, 20 MB used and a 4.25sec load time.

This is system memory usage, so I would think whether you buffer in C or PHP wouldn't matter much, as mysqli seems to indicate. However, there is a staggeringly large difference (I'm guessing) between libmysqlclient (or whatever mysql_* uses) and mysqlnd, and presumably different code paths within PHP.
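Whether mysqlnd or libmysqlclient is actually in use can be checked at runtime with standard PHP functions, e.g.:

```php
<?php
// mysqlnd builds report "mysqlnd" in the client library version string.
if (function_exists('mysqli_get_client_info')) {
    echo mysqli_get_client_info(), PHP_EOL;
}
// The mysqlnd extension itself can also be probed directly.
var_dump(extension_loaded('mysqlnd'));
```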

The DB layer should only be creating one instance of mysqli or PDO. The queries haven't changed at all, and the DB layer assigns the full result set to an array and that array to a result object before returning the object to the code that is making the query. I've tried looping through with fetch() or its equivalents and I've tried simply assigning the return of fetchAll() or its equivalents and neither made a difference. For example:

	// Buffer every row into a plain array...
	$data = array();
	$i = 0;
	while ($r = $result->fetch(PDO::FETCH_ASSOC)) {
		$data[$i] = $r;
		$i++;
	}
	// ...then wrap the buffered rows in a simple result object.
	$query = new stdClass();
	$query->num_rows = $i;
	$query->row = isset($data[0]) ? $data[0] : array();
	$query->rows = $data;
	$result = null; // release the statement/result handle
	return $query;

Only the while loop differs between the three DB engines (aside from, e.g., fetchAll). I would expect similar memory usage in all cases given this design, since it always buffers everything into a PHP object/array one way or another, or even twice as much if it's copying data, but it somehow seems to use ten times the memory (or more) with mysqli or PDO (or mysqlnd) than with mysql_*.
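For comparison, the fetchAll() form the reporter also tried collapses the loop to something like this (a sketch; $result is assumed to be an already-executed PDOStatement):

```php
<?php
// Buffer all rows in one call instead of a fetch() loop.
$data = $result->fetchAll(PDO::FETCH_ASSOC);
$query = new stdClass();
$query->num_rows = count($data);
$query->row = isset($data[0]) ? $data[0] : array();
$query->rows = $data;
```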

This behavior seems like a bug to me, or GC isn't happening, or...something is definitely wrong. I know the actual data being queried is small. Even using the recommendations on dealing with large result sets doesn't work.

At this point I have no idea what to do. I can't feasibly rewrite all the queries to use enumerators (though I don't know if that would actually help), and in many cases I can't do that anyway because it needs the whole result set.
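The "enumerator" approach mentioned here would look something like a PHP generator (available since 5.5) that yields rows instead of buffering them; a sketch, which only helps callers that can stream and not those needing the whole result set:

```php
<?php
// Generator wrapper: only one row is held in PHP memory per iteration
// (assuming an unbuffered cursor on the driver side).
function streamRows(PDOStatement $stmt) {
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        yield $row;
    }
    $stmt->closeCursor();
}

// Usage: foreach (streamRows($pdo->query($sql)) as $row) { /* ... */ }
```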




History

 [2017-03-24 08:44 UTC] spam2 at rhsoft dot net
I can assure you that this is not normal - I replaced mysql_* with mysqli_* in our database layer many years ago and never saw such issues with 5.3, 5.4, 5.5, 5.6, 7.0 or 7.1 - it sounds like you hit a memory leak.

Maybe you can look at https://bugs.php.net/bug.php?id=73889, where my problem is that I get the valgrind leak but am unable to isolate which of the thousands of automated requests, and which piece of code, triggers it.

You need a debug build and valgrind.
 [2017-03-25 00:50 UTC] josh at endries dot org
Update:

I changed all text columns used in the index queries to varchar but it made no difference. A number of these queries do SELECT * unfortunately.

I tried calling GC after each "$result = null;" at the end of the PDO query() call, but it made no difference, except that it took much longer (I figured, what the heck, see what happens, maybe it's due to the number of queries).
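The GC attempt described above amounts to something like the following (gc_collect_cycles() is a standard PHP function; note it only reclaims memory tied up in reference cycles, while plain refcounted data is freed immediately when the last reference drops):

```php
<?php
$result = null;               // drop the statement/result reference
$freed = gc_collect_cycles(); // force a cycle-collector run
// $freed is the number of cycles collected in this run.
```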

I'm using the binary package currently. I'll look into making a debug port build, not sure how to do that (the debug part, not the port part).

If there are any other ideas on things to try, please let me know. Thanks.
 [2017-03-25 01:05 UTC] nikic@php.net
Are you able to reproduce the increased memory usage in a self-contained script, or do you see this effect only as part of a larger application?
 [2017-03-25 02:15 UTC] josh at endries dot org
Ah, thanks for asking, I thought about something similar this morning but forgot. I just logged, copied and pasted the queries it runs into a single file and it worked, e.g.:

foreach ($queries as $q) { $r = $db->query($q); /* ... */ }

It's difficult to replicate the actual processing in one file, but looping over an array of the queries and running them, then counting $r->rows, works quickly without much memory usage.
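A minimal self-contained reproducer along these lines, with peak-memory reporting added, might look like this (the connection details and "queries.log" file are hypothetical; it needs a live MySQL server):

```php
<?php
// Replay the logged queries standalone and report peak memory,
// to compare against the full application's footprint.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholders
$queries = file('queries.log', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($queries as $q) {
    $r = $db->query($q);
    $rows = $r->fetchAll(PDO::FETCH_ASSOC);
    $r = null;
}
printf("peak: %.1f MB\n", memory_get_peak_usage(true) / 1048576);
```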

I'm not really sure what this might mean, if anything. It doesn't model the actual processing, which apparently makes a difference. I can look into doing that, but it will probably be tough. Maybe I can just squish all the files into one big one...there are dozens of classes, at least.

I don't really understand how the app could somehow cause the memory issues within its processing when using PDO but not mysql_* when it just deals with the returned object. I mean I would think it would have the same memory issues regardless of the driver if the problem were the app's usage of the result objects. Hmmm.
 [2017-03-27 17:29 UTC] nikic@php.net
Which PHP version did you try this on? If on 5.6, I'd suggest testing this on PHP 7 as well. I remember fixing a number of leaks, at least in PDO, relating either to error conditions or circular references. Maybe you hit a problem that has already been resolved.
 [2020-06-03 16:12 UTC] cmb@php.net
-Status: Open +Status: Feedback -Assigned To: +Assigned To: cmb
 [2020-06-03 16:12 UTC] cmb@php.net
Do you still experience this issue with any of the actively
supported PHP versions[1]?

[1] <https://www.php.net/supported-versions.php>
 [2020-06-14 04:22 UTC] php-bugs at lists dot php dot net
No feedback was provided. The bug is being suspended because
we assume that you are no longer experiencing the problem.
If this is not the case and you are able to provide the
information that was requested earlier, please do so and
change the status of the bug back to "Re-Opened". Thank you.