[2005-04-18 08:42 UTC] derick@php.net
[2005-04-19 01:31 UTC] pvandijk at gmail dot com
Description:
------------
Hashes with a large number of keys and large key data sizes seem to cause memory corruption, which in turn causes PHP to either segfault or hang on exit, depending on the context of the code. I've heard mention that arrays are not unlimited in size. The issue occurs at about 65535 elements in my tests, but also depends on the size of the keys; presumably this is because I'm indexing my arrays with strings, so it runs out of memory faster. If the memory limit of a hash is reached, shouldn't it be handled more gracefully than corrupting memory, which results in a segfault? The code example I've provided reproduces the crash under both Linux and Windows on PHP 4.3.11.

Reproduce code:
---------------
<?php
$data = 'hello, i like cheese';
$ar = array();

// Build roughly two million string-keyed entries.
for ($i = 1000000; $i < 3000000; $i++) {
    $key = 'abc' . $i;
    $ar[$key] = $data;
}

// Verify every value survived intact.
function check($ar) {
    global $data;
    foreach ($ar as $k => $value) {
        if ($data != $value) {
            print 'invalid value: ' . $k . ' => ' . $value . "\r\n";
        }
    }
}

check($ar);
print 'done.' . "\r\n";
?>

Expected result:
----------------
done.

Actual result:
--------------
done.
Segmentation fault

Sorry, I can't provide a backtrace or any further info; I don't have access to these tools on my current dev server.
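Until the allocator handles exhaustion gracefully, one way to avoid the crash is to guard the array's growth against a self-imposed memory ceiling. Below is a minimal sketch of the reproduce loop with such a guard, assuming `memory_get_usage()` is available (on 4.3.x it requires a build with --enable-memory-limit); the 64 MB ceiling is an arbitrary example value, not a recommendation.

```php
<?php
// Sketch: stop growing the array before memory use becomes dangerous,
// rather than letting the hash implementation corrupt memory.
// The 64 MB ceiling below is an arbitrary example value.
$ceiling = 64 * 1024 * 1024;

$data = 'hello, i like cheese';
$ar = array();

for ($i = 1000000; $i < 3000000; $i++) {
    // Check current usage before inserting another element.
    if (memory_get_usage() > $ceiling) {
        print 'memory ceiling reached at ' . count($ar) . " elements\r\n";
        break;
    }
    $ar['abc' . $i] = $data;
}

print 'done.' . "\r\n";
?>
```

This only sidesteps the symptom from user land; the underlying graceful-failure behaviour still has to come from the engine.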