Bug #35005 Opening a lot of files results in no network connectivity
Submitted: 2005-10-27 22:31 UTC Modified: 2005-11-20 19:26 UTC
Votes: 3
Avg. Score: 5.0 ± 0.0
Reproduced: 3 of 3 (100.0%)
Same Version: 2 (66.7%)
Same OS: 2 (66.7%)
From: daniel at polkabrothers dot com Assigned:
Status: Not a bug Package: Network related
PHP Version: 5CVS-2005-10-31 (snap) OS: Mac OS X 10.4.2
Private report: No CVE-ID: None
 [2005-10-27 22:31 UTC] daniel at polkabrothers dot com
Description:
------------
When opening a lot of files (3000 in this case) under Mac OS X, 
network connectivity disappears.

This has been tested under Linux 2.6, and works fine.

Reproduce code:
---------------
<?php
$fp = array();
for ($x = 0; $x < 3000; $x++) {
    $fp[$x] = fopen("/tmp/$x", "w");
}

$url_fp = fopen("http://www.google.com", "r");
var_dump(fread($url_fp, 1500));

Expected result:
----------------
To get the first 1500 bytes from www.google.com

Actual result:
--------------
string(0) ""
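A self-contained variant of the reproduce code (an editor's diagnostic sketch, not part of the original report) can measure where the descriptor limit actually bites, without needing network access or littering /tmp:

```php
<?php
// Diagnostic sketch: open temporary files until tmpfile() fails, to see
// how many descriptors this process can hold before hitting its limit.
// The 5000 cap is an arbitrary safety bound for systems with huge limits.
$handles = array();
while (count($handles) < 5000 && ($fp = @tmpfile()) !== false) {
    $handles[] = $fp;
}
echo "opened " . count($handles) . " files before a limit (or the 5000 cap)\n";
foreach ($handles as $fp) {
    fclose($fp);
}
```

If the count lands just above 1017, that supports the theory that the per-process descriptor limit, rather than anything network-specific, is what breaks the subsequent connection.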

 [2005-10-27 22:50 UTC] daniel at polkabrothers dot com
Have now done a bit more testing, and it only happens if you 
try to open more than 1017 files and then try to open a URL.

I've tried opening URLs with fopen(), the curl_* functions, and 
exec("wget"). Same end result: they don't connect.

PHP doesn't generate any error messages when opening with 
fopen(). With the curl functions, curl returns "couldn't 
connect", and with more debugging enabled it reports "Unknown 
error: 0". When exec()'ing wget, it stops as soon as it gets a 
connection and is about to output "200 OK".

(I have read the how-to on reporting bugs, but can't find what 
I'm missing to include.)
 [2005-10-27 22:56 UTC] tony2001@php.net
Looks like Mac OS X has its maximum number of file descriptors set to 1024 or something like that.
I don't have Mac OS X around here, but I guess this fact should be documented somewhere @ apple.com.
Could you check it?
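The limits Tony is asking about can be inspected from a shell (a diagnostic sketch; the two sysctl keys are macOS-specific, so they are guarded to fail harmlessly elsewhere):

```shell
# Inspect the effective open-file limits for this shell and, on macOS,
# the kernel-level caps that sit above them.
ulimit -n         # soft per-process limit in this shell
ulimit -Hn        # hard per-process limit
sysctl kern.maxfilesperproc 2>/dev/null || true   # macOS per-process cap
sysctl kern.maxfiles        2>/dev/null || true   # macOS system-wide cap
```

Raising the shell's soft limit with `ulimit -n` only helps if the value stays within the hard limit and the kernel caps.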
 [2005-10-27 23:06 UTC] daniel at polkabrothers dot com
I used ulimit -n to increase the number of allowed open 
files; otherwise it wouldn't even allow me to create 3000 
files.

Now ulimit -a gives me:

core file size        (blocks, -c) 0
data seg size         (kbytes, -d) 6144
file size             (blocks, -f) unlimited
max locked memory     (kbytes, -l) unlimited
max memory size       (kbytes, -m) unlimited
open files                    (-n) 10240
pipe size          (512 bytes, -p) 1
stack size            (kbytes, -s) 8192
cpu time             (seconds, -t) unlimited
max user processes            (-u) 100
virtual memory        (kbytes, -v) unlimited

Can't find anything else that relates to file descriptors 
and Mac OS X.
 [2005-10-27 23:17 UTC] daniel at polkabrothers dot com
I should probably add that I've tried running this both as 
root (su root; ulimit -n 5000) and using sudo (sudo php ...).

Same result.
 [2005-11-20 19:26 UTC] sniper@php.net
Works fine for me. Just increase your limits and it will work.
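One way to verify the advice above (a sketch, assuming the POSIX extension is loaded) is to print the limits the PHP process itself inherited, since a ulimit raised in one shell does not necessarily reach the process that does the work:

```php
<?php
// Sketch: report the open-file resource limits as seen from inside PHP.
// posix_getrlimit() returns an array keyed by strings such as
// 'soft openfiles' and 'hard openfiles'.
$limits = posix_getrlimit();
echo "soft openfiles: " . $limits['soft openfiles'] . "\n";
echo "hard openfiles: " . $limits['hard openfiles'] . "\n";
```

If the soft limit printed here is still 1024 despite a raised shell ulimit, the increase never reached the PHP process.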

 