Bug #43242 Segmentation fault (11) with curl_multi
Submitted: 2007-11-11 05:25 UTC Modified: 2007-11-11 09:42 UTC
From: administrator at proxy-list dot org Assigned:
Status: Closed Package: cURL related
PHP Version: 5.2.4-2007-11-11 OS: FreeBSD
Private report: No CVE-ID: None
 [2007-11-11 05:25 UTC] administrator at proxy-list dot org
Description:
------------
The script crashes in curl_multi_exec() when curl_multi is used under high load. Specifically, I tried to download about 30,000 web pages during a single PHP script execution (this usually takes from 5 to 20 minutes).

What makes the problem hard to pin down is that the script crashes after a different number of downloaded pages each time (sometimes after 500, other times after 13,000).

P.S. I have run several tests on two different dedicated servers and one PC. The results can be provided upon request.

Reproduce code:
---------------
To see the script with all necessary test data: http://gesoft.org/curl-demo/curl_multi.rar

<?php
$mh = curl_multi_init();
// split all links into chunks
$chunks = array_chunk($links, $threadsAmount);
foreach ($chunks as $chunk) {
  // fill curl_multi with curl objects
  foreach ($chunk as $link) {
    if ($ch = CreateCURL(trim($link))) {
      $resultCode = curl_multi_add_handle($mh, $ch);
    }
  }
  // run the transfers until all pages have been downloaded
  do {
    $resultCode = curl_multi_exec($mh, $activeThreads);
  } while ($activeThreads > 0);
  // collect the resulting page of each completed transfer
  while ($info = curl_multi_info_read($mh)) {
    $HTMLSource = ($info['result'] == 0)
      ? curl_multi_getcontent($info['handle'])
      : 'CURL_ERROR: ' . $info['result'];
    curl_multi_remove_handle($mh, $info['handle']);
  }
}
curl_multi_close($mh);
?>
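For comparison, a driving loop that blocks on curl_multi_select() between curl_multi_exec() calls avoids spinning the CPU while transfers are in flight, which is how this pattern is usually written. The sketch below is illustrative, not the reporter's code: the CreateCURL() helper from the original script is not available here, so a plain curl_init() setup with assumed options (CURLOPT_RETURNTRANSFER, a 30-second timeout) stands in for it, and the batch size is arbitrary.

```php
<?php
// Download a list of URLs in batches over one curl_multi handle,
// waiting on curl_multi_select() instead of busy-looping.
function fetchAll(array $links, int $batchSize = 10): array
{
    $results = [];
    $mh = curl_multi_init();
    foreach (array_chunk($links, $batchSize) as $chunk) {
        foreach ($chunk as $link) {
            $ch = curl_init(trim($link));
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 30);
            curl_multi_add_handle($mh, $ch);
        }
        // Drive the transfers; block briefly when nothing is ready yet.
        do {
            $status = curl_multi_exec($mh, $active);
            if ($active) {
                curl_multi_select($mh, 1.0);
            }
        } while ($active > 0 && $status == CURLM_OK);
        // Harvest completed transfers and release their handles.
        while ($info = curl_multi_info_read($mh)) {
            $ch = $info['handle'];
            $url = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
            $results[$url] = ($info['result'] == CURLE_OK)
                ? curl_multi_getcontent($ch)
                : 'CURL_ERROR: ' . $info['result'];
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
    }
    curl_multi_close($mh);
    return $results;
}
```

With an empty link list the function simply returns an empty array; with real URLs each result maps the effective URL to the page body, or to a 'CURL_ERROR: n' marker when the transfer failed.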

Expected result:
----------------
All pages are downloaded and the script completes without crashing.

Actual result:
--------------
Segmentation fault (11): core dumped
(gdb) bt
#0  0x2879fed0 in ?? ()
#1  0xbfbfcd66 in ?? ()
#2  0xbfbfcd67 in ?? ()
#3  0xbfbfcd67 in ?? ()
#4  0x00000000 in ?? ()
...
#327 0x00000000 in ?? ()
#328 0xbfbfe8fc in ?? ()
#329 0x28228cdd in ?? ()
#330 0x2855cb03 in ?? ()
#331 0x08057694 in set_recursion_limit ()

 [2007-11-11 09:42 UTC] administrator at proxy-list dot org
I have just updated PHP to the latest 5.2.4 version, and it looks like the problem is solved. I successfully downloaded 29,000 links without any interruption.