I'm trying to process a tab-separated file. The file is about 1.5M and contains 9040 lines of 85 "columns" each. Quite happy with the fgetcsv() function, I started working on this. But when I finished, I didn't get the output I wanted, only errors in my log file: "memory exhausted". So I started trying all kinds of things. The piece of script I use for the processing looks like:

  if (!($fd = fopen ($file, "r"))) {
      // Some error and exit
  }
  $data  = array();
  $total = array();
  while ($data = fgetcsv ($fd, 580, $delimiter)) {
      array_push ($total, $data);
  }
  if (!fclose ($fd)) {
      // Some error and exit
  }

I've configured Apache to log the peak memory usage of the script, and guess what: 28M!!! It takes 28M to read a file of hardly 1.5M! Really weird. So I tried something else:

  $contents = file ($file);

No problem! Memory usage: 2M. So, I thought, maybe there's a bug in fgetcsv(). Let's try it myself:

  $contents = file ($file);
  $total    = array();
  $record   = array();
  foreach ($contents as $line) {
      $record = split ("\t", $line);
      array_push ($total, $record);
  }

Memory usage: 28M. So that's not the solution. Maybe array_push() doesn't work correctly then. Next try:

  if (!($fd = fopen ($file, "r"))) {
      // Some error and exit
  }
  $data    = array();
  $total   = array();
  $counter = 0;
  while ($data = fgetcsv ($fd, 580, $delimiter)) {
      $total[$counter++] = $data;
  }
  if (!fclose ($fd)) {
      // Some error and exit
  }

Memory usage: 24.7M. So that's no solution either. Then I tried the following change to the while loop:

  while ($total[] = fgetcsv ($fd, 580, $delimiter)) {
  }

No luck, same memory usage. Okay, that one was predictable. Then I came up with the following very nasty construction:

  if (!($fd = fopen ($file, "r"))) {
      // Some error and exit
  }
  $data    = array();
  $counter = 0;
  while ($data = fgetcsv ($fd, 580, $delimiter)) {
      $var_name  = "myVar_$counter";
      $$var_name = $data;
      $counter++;
  }
  if (!fclose ($fd)) {
      // Some error and exit
  }

Nasty, isn't it?! It didn't work either: memory usage 24.8M, so I quickly threw that one out again :) I've tried several combinations of the constructions above; none of them worked correctly, the memory usage was always way too high. One more experiment:

  if (!($fd = fopen ($file, "r"))) {
      // Some error and exit
  }
  $data  = array();
  $total = array();
  while ($data = fgetcsv ($fd, 580, $delimiter)) {
      array_push ($total, implode ("\t", $data));
  }
  if (!fclose ($fd)) {
      // Some error and exit
  }

Yes, that worked: memory usage 2M. But the result is the same as just $total = file ($file), so that's not the way. In short, I've tried about everything, but everything that produces the correct result (a 2D array with all the 'records' in it) also produces a memory usage that is way too high. Does anyone know where this bug comes from?

*R&zE:

Btw, PHP version 4.0.8-dev, configured with:

  --prefix=/usr/local/php --with-config-file-path=/usr/local/php/etc --with-exec-dir=/usr/local/php/safe --with-apxs=/usr/local/Apache/bin/apxs --without-mysql --with-solid=/home/solid --with-pgsql=/usr/local/pgsql --with-pdflib=/usr/local --with-db3 --enable-ftp --with-mm --with-zlib --with-bz2 --with-openssl --with-gd --enable-gd-native-ttf --with-jpeg-dir --with-png-dir=/usr --with-zlib-dir=/usr --with-xpm-dir=/usr/X11R6 --with-ttf --with-t1lib --with-pcre-regex --enable-sysvsem --enable-sysvshm --enable-memory-limit --enable-inline-optimization --enable-versioning

Apache version 1.3.14
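For comparison, here is a minimal sketch (not part of the original report) of a streaming approach that sidesteps the blow-up: each record is handled inside the fgetcsv() loop instead of being collected into one big 2D array, so memory stays close to the size of a single row. The file path and the per-record work are placeholders; memory_get_usage() is assumed to be available because this build uses --enable-memory-limit.

  <?php
  // Hypothetical streaming sketch: process each record as it is read
  // instead of pushing every row into $total. The path and the
  // per-record work below are placeholders, not from the report.
  $file      = "/path/to/data.tsv";
  $delimiter = "\t";

  if (!($fd = fopen ($file, "r"))) {
      die ("cannot open $file");
  }

  while ($data = fgetcsv ($fd, 580, $delimiter)) {
      // Do the per-record work here, e.g. look at $data[0] .. $data[84],
      // rather than keeping all 9040 rows around at once.
  }

  // memory_get_usage() requires a PHP build with --enable-memory-limit.
  echo "memory in use: " . memory_get_usage () . " bytes\n";

  if (!fclose ($fd)) {
      die ("cannot close $file");
  }
  ?>

This does not explain the per-row overhead of the big array itself, but it keeps the script usable for files of this size when the rows can be processed one at a time.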