Bug #31515 scandir() is slower than user-function
Submitted: 2005-01-12 13:48 UTC
Modified: 2005-02-22 01:24 UTC
From: akorthaus at web dot de
Assigned:
Status: Closed
Package: Performance problem
PHP Version: 5.0.3
OS: Linux 2.4.28 (Gentoo)
Private report: No
CVE-ID: None
 [2005-01-12 13:48 UTC] akorthaus at web dot de
I do not understand why the new scandir() function is slower than a user-defined PHP function that does the same thing (I used "Example 2. PHP 4 alternatives to scandir()" from the manual).

I tried this with 50 to 100,000 files, but the result is always the same.

my_scandir() is about 50%-100% faster. If I don't sort, it is about 400% faster.

Reproduce code:
function my_scandir($dir) {
    $files = array();
    $dh = opendir($dir);
    while (false !== ($filename = readdir($dh))) {
        $files[] = $filename;
    }
    closedir($dh);
    sort($files); // scandir() sorts its result, so sort here as well
    return $files;
}
$t1 = microtime(TRUE);
$files = my_scandir('/tmp');
$t2 = microtime(TRUE);
echo "count: ".count($files)."\n";
echo $t2-$t1;
echo "\n";

$t1 = microtime(TRUE);
$files = scandir('/tmp');
$t2 = microtime(TRUE);
echo "count: ".count($files)."\n";
echo $t2-$t1;
echo "\n";

Expected result:
I expect the C function to be faster.

Actual result:
The PHP function is about 50-100% faster.
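To reproduce at these sizes, a minimal setup sketch (the path and file count here are placeholder assumptions, not part of the report):

// Hypothetical helper: populate a test directory with N empty files
$dir = '/tmp/scandir_test';
if (!is_dir($dir)) {
    mkdir($dir);
}
for ($i = 0; $i < 100000; $i++) {
    touch($dir . '/file' . $i);
}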


 [2005-01-12 21:51 UTC]
count: 2034
count: 2034

Only difference:
foreach(range(1, 5000) as $unused)
    $files = scandir('C:\WINDOWS\System32');

So, not reproducible on Win32. Wrap the call in a foreach as I did above and spread the measurement over quite a few calls; with repeated runs of a single function call, the timings went back and forth for me.
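Spelled out, that approach might look like this (a sketch; the timing wrapper is an assumption, since only the loop is shown above):

$t1 = microtime(TRUE);
foreach (range(1, 5000) as $unused) {
    $files = scandir('C:\WINDOWS\System32');
}
$t2 = microtime(TRUE);
echo "count: " . count($files) . "\n";
echo ($t2 - $t1) . "\n";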
 [2005-01-12 23:59 UTC] akorthaus at web dot de
With a small directory I get:

count: 71

count: 71

With 100,000 files it takes too long, and scandir() runs into the memory_limit (which is 500 MB!)

scandir() seems to need much more memory!
I added the following line to the scripts:
echo "mem: ".number_format(memory_get_usage()/1048576) . "M\n";

so I get:

mem: 10M
count: 100002

mem: 397M
count: 100002

If I put in (scandir version):

foreach(range(1, 2) as $unused)

I get:

Fatal error: Allowed memory size of 524288000 bytes exhausted (tried to allocate 4096 bytes) in /home/akorthaus/test/scan.php on line 5

If I put in (my_scandir version):

foreach(range(1, 10) as $unused)

mem: 10M
count: 100002

which is the same as with only one cycle.
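For clarity, a consolidated harness matching these fragments (an assumption pieced together from the snippets above; $dir and $iterations are placeholders):

$dir = '/tmp/files';   // placeholder test directory
$iterations = 2;       // e.g. foreach(range(1, 2) ...) as above

$t1 = microtime(TRUE);
foreach (range(1, $iterations) as $unused) {
    $files = scandir($dir); // swap in my_scandir($dir) to compare
}
$t2 = microtime(TRUE);

echo "mem: " . number_format(memory_get_usage() / 1048576) . "M\n";
echo "count: " . count($files) . "\n";
echo "time: " . ($t2 - $t1) . "s\n";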
 [2005-01-13 02:10 UTC]
Please try using this CVS snapshot:
For Windows:

That's amazing. Try 5.0.4-dev.
 [2005-01-13 03:09 UTC] akorthaus at web dot de
I tried php5-STABLE-200501122330:

./configure \
  --prefix=/home/akorthaus/bin/php5-STABLE-200501122330 \
  --disable-all \
  --with-pcre-regex \

With the following results:

scandir (foreach:500, files:527)
mem: 2M
time: 10.242558956146s

my_scandir (foreach:500, files:527)
mem: 0M
time: 2.3772580623627s

scandir (foreach:1, files:10000)
mem: 40M
time: 0.40674495697021s

my_scandir (foreach:1, files:10000)
mem: 1M
time: 0.17293095588684s

scandir (foreach:100, files:10000)
mem: 40M
time: 41.659919977188s

my_scandir (foreach:100, files:10000)
mem: 1M 
time: 20.631703853607s
 [2005-01-13 03:43 UTC] akorthaus at web dot de
The same with php5-STABLE-200501130130.
 [2005-01-14 11:39 UTC] akorthaus at web dot de
I tried php5-STABLE-200501140930 with the same result.

The size of the directory listing ("files"):

Number of files:
ls -1 files | wc -l

Number of bytes:
ls -1 files | wc -c
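A rough PHP equivalent of those shell commands (a sketch; the byte count is approximate, since wc -c also counts one newline per name):

$files = scandir('files');
echo count($files) . " files\n";
echo strlen(implode("\n", $files)) . " bytes\n";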
 [2005-02-22 01:24 UTC]
This bug has been fixed in CVS.

Snapshots of the sources are packaged every three hours; this change
will be in the next snapshot. You can grab the snapshot at
Thank you for the report, and for helping us make PHP better.
