Bug #38620 curl_multi_* - timeout after 4-6 socket connects on different hosts
Submitted: 2006-08-28 01:08 UTC Modified: 2006-09-14 01:00 UTC
Avg. Score:4.6 ± 0.7
Reproduced:6 of 6 (100.0%)
Same Version:4 (66.7%)
Same OS:6 (100.0%)
Status: No Feedback Package: cURL related
PHP Version: 5.1.6 OS: Debian Sarge
Private report: No CVE-ID: None


 [2006-08-28 01:08 UTC]
try the example code on the function.curl-multi-exec page in the manual with 20 different hosts (cnn, yahoo, blogspot, etc.)
it always gets results from only 4, at most 6, sockets; every connect after those gives a timeout.

on the same host you can force as many sockets as you want, but on different hosts you get a timeout after at most 6 sockets.

(i tried it on 3 different machines, each hosted at a different provider)

Reproduce code:
see the example code in the comments on the function.curl-multi-exec page in the manual.

Expected result:
all sockets get a page back.

Actual result:
the first 4-6 sockets get a page; every socket after the first 4-6 brings back a timeout.


 [2006-08-28 06:47 UTC]
And why do you think it's a PHP problem?
 [2006-09-01 02:41 UTC]
because i tested it on 3 different systems, each at a different provider.
always the same result: in every curl_multi_init the first 4 handles work, the rest get a timeout.

to make sure it's not a resolving limit or something, i put up a test script that i started 20 times at the same time with a small bash script and screen. each of the 20 scripts uses only 4 sockets per init, and they all work perfectly.
that makes sure there's no limit at the provider or on the machines for resolving domains or whatever.
so there must be a problem in the curl_multi functions... don't know where or what, but it is a bug.

 [2006-09-01 02:45 UTC]
just try it yourself.
take the example script and add ~20 urls instead of just 3,
and run that in a loop. you will see the first 4, at most 6, connects working and getting a result back. every socket per loop after those working sockets just returns a timeout.
 [2006-09-01 03:18 UTC]
okay, i tried it with the latest version (5.1.6) and it does the same... 5/6 connects work, the rest time out.
 [2006-09-01 09:37 UTC]
And it's not a problem of cURL because .... ?
 [2006-09-02 14:14 UTC]
because curl_multi is very buggy...
show me evidence that it's curl... i don't think so.
 [2006-09-02 16:12 UTC]
Set a reasonable timeout option then.
 [2006-09-02 16:31 UTC]
you can set it to 200 seconds, it doesn't change the result.
 [2006-09-02 17:42 UTC]
There's no example code on the manual page.

Did you only set the transfer timeout or the connect timeout option too?

 [2006-09-02 18:10 UTC]
see the manual page for curl_multi_exec; the example code is in the comments there.

trust me, i tested it very hard with everything i know.
i need this function... so the only workaround for me is to code a c++ app, write a perl script, or something else, but you can't use php for this. and that sucks,
because the function is there but you can't use it.

i mean it should work. why is there a function that doesn't work... drives me nuts ;)
 [2006-09-02 22:45 UTC]
just btw, the multi interface of libcurl works fine with perl, c (and c++).
 [2006-09-03 09:20 UTC]
Complete reproduce code?  The multi interface also works fine with pecl/http.
 [2006-09-06 07:54 UTC]

$u[] =  '';
$u[] =  '';
$u[] =  '';
$u[] =  '';
$u[] =  '';
$u[] =  '';
$u[] =  '';
$u[] =  '';
$u[] =  '';
$u[] =  '';
$u[] =  '';
$u[] =  '';

# uncomment this to test with a single host (should work with no timeouts in a snap)
# $u = array_fill( 0, 20, '' );

$mh = curl_multi_init();

foreach( $u as $i => $d ) {
        $conn[$i] = curl_init( 'http://'.$d );
        curl_setopt( $conn[$i], CURLOPT_RETURNTRANSFER, 1 );
        curl_setopt( $conn[$i], CURLOPT_FOLLOWLOCATION, 1 ); // follow redirects
        curl_setopt( $conn[$i], CURLOPT_MAXREDIRS, 1 ); // maximum redirects
        curl_setopt( $conn[$i], CURLOPT_USERAGENT, 'Mediapartners-Google/2.1' );
        curl_setopt( $conn[$i], CURLOPT_TIMEOUT, 20 );
        curl_multi_add_handle( $mh, $conn[$i] );
}

do {
        $mrc = curl_multi_exec( $mh, $active );
} while( $mrc == CURLM_CALL_MULTI_PERFORM );

while( $active and $mrc == CURLM_OK ) {
        if( curl_multi_select( $mh ) != -1 ) {
                do {
                        $mrc = curl_multi_exec( $mh, $active );
                } while( $mrc == CURLM_CALL_MULTI_PERFORM );
        }
}

if( $mrc != CURLM_OK ) {
        print "Curl multi read error $mrc\n";
}

foreach( $u as $i => $d ) {
        echo $i.' - '.$d."\n";
        if( ( $err = curl_error( $conn[$i] ) ) == '' ) {
                echo 'found something';
        } else {
                echo $err."\nnoconnect";
        }
        echo "\n--\n";
        curl_multi_remove_handle( $mh, $conn[$i] );
        curl_close( $conn[$i] );
}

curl_multi_close( $mh );

 [2006-09-06 08:13 UTC]
Please try using this CVS snapshot:
For Windows:

 [2006-09-06 12:39 UTC]
You only set the overall timeout; you should also try setting a reasonably high connect timeout.
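A minimal sketch of that suggestion, assuming a single handle for brevity: CURLOPT_CONNECTTIMEOUT caps only the connection phase, while CURLOPT_TIMEOUT caps the whole transfer. The host and the 20-second values here are just illustrative choices, not part of the original report.

```php
<?php
// Sketch: set both timeout options on a handle.
// CURLOPT_CONNECTTIMEOUT limits only the TCP connect phase (seconds);
// CURLOPT_TIMEOUT limits the entire transfer (seconds).
$ch = curl_init( 'http://example.com/' ); // hypothetical host
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, 1 );
curl_setopt( $ch, CURLOPT_CONNECTTIMEOUT, 20 ); // connect timeout
curl_setopt( $ch, CURLOPT_TIMEOUT, 20 );        // overall transfer timeout
```

The same pair of curl_setopt() calls would go inside the foreach loop of the reproduce script above, on each $conn[$i] handle.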
 [2006-09-14 01:00 UTC] php-bugs at lists dot php dot net
No feedback was provided for this bug for over a week, so it is
being suspended automatically. If you are able to provide the
information that was originally requested, please do so and change
the status of the bug back to "Open".