Bug #62409 Using cURL multi to save urls to a file then reading files - files chopped
Submitted: 2012-06-25 15:41 UTC Modified: 2012-09-28 22:25 UTC
From: emmet at trovit dot com Assigned:
Status: Not a bug Package: cURL related
PHP Version: 5.3.14 OS: Mac OS X 10.6.8
Private report: No CVE-ID: None


 [2012-06-25 15:41 UTC] emmet at trovit dot com
I'm using curl multi to download files in parallel. In the test script I specify that the contents of each download should be written to a file using the CURLOPT_FILE option. The file is downloaded correctly, but when I try file_get_contents() or fopen() on the downloaded file, the contents are always chopped off: if I put the contents of the file in a variable and count the length of the string, it is at most 40960 characters. The file itself is downloaded entirely to disk; it's just PHP that won't read it all, limiting it to 40 KB.
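The pattern described can be sketched as follows. All file names are hypothetical, and a file:// URL stands in for the real download so the sketch runs without a network; the point is that the file is read back while the target handle is still open:

```php
<?php
// Hypothetical reproduction sketch: download with curl multi into a
// file handle, then read the file back WITHOUT closing the handle first.
$src = tempnam(sys_get_temp_dir(), 'src');
$dst = tempnam(sys_get_temp_dir(), 'dst');
file_put_contents($src, str_repeat('x', 100000)); // > 40960 bytes

$fh = fopen($dst, 'w');
$ch = curl_init('file://' . $src);    // stands in for a real http:// URL
curl_setopt($ch, CURLOPT_FILE, $fh);  // write the response body to $fh

$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch);
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);
curl_multi_remove_handle($mh, $ch);
curl_multi_close($mh);
curl_close($ch);

// $fh is still open here, so part of the body may still sit in the
// stream buffer, and this read can come back truncated.
$contents = file_get_contents($dst);
```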

Tested with PHP 5.3.8, 5.3.13 and 5.3.14, the same thing happens with all versions.

Output of php -v:
PHP 5.3.8 (cli) (built: Dec  5 2011 21:24:09) 
Copyright (c) 1997-2011 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2011 Zend Technologies
    with Xdebug v2.1.2, Copyright (c) 2002-2011, by Derick Rethans

PHP 5.3.13 (cli) (built: May  9 2012 07:21:29) 
Copyright (c) 1997-2012 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2012 Zend Technologies
    with Xdebug v2.1.3, Copyright (c) 2002-2012, by Derick Rethans

PHP 5.3.14 (cli) (built: Jun 25 2012 16:47:44) 
Copyright (c) 1997-2012 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2012 Zend Technologies

This bug is not quite the same as another one reported previously, but they could be related.

Test script:
I've put the code on Pastebin; I can upload it again or email it if needed:

Expected result:
The script should echo out the entire contents of the newly downloaded webpage (which has been put in a local file).

Actual result:
The first 40960 characters are output. Checking the file manually with tail -f shows that it was downloaded entirely, but PHP won't display it all.


 [2012-06-25 16:43 UTC] emmet at trovit dot com
Turns out that everything works as expected when you close the original file handle just before calling file_get_contents() on the same file. So this is not a bug; more of an 'undocumented misbehaviour'.
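The working order of operations can be sketched like so (file names hypothetical, and a file:// URL standing in for the real download so it runs without a network): close the handle that curl wrote into before reading the file back.

```php
<?php
// Hypothetical sketch of the fix: fclose() the target handle BEFORE
// reading the file, so buffered data is flushed to disk.
$src = tempnam(sys_get_temp_dir(), 'src');
$dst = tempnam(sys_get_temp_dir(), 'dst');
file_put_contents($src, str_repeat('x', 100000)); // > 40960 bytes

$fh = fopen($dst, 'w');
$ch = curl_init('file://' . $src);    // stands in for a real http:// URL
curl_setopt($ch, CURLOPT_FILE, $fh);  // write the response body to $fh

$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch);
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);
curl_multi_remove_handle($mh, $ch);
curl_multi_close($mh);
curl_close($ch);

// The crucial step: closing the handle flushes any remaining buffered
// bytes, so subsequent reads see the whole file.
fclose($fh);

echo strlen(file_get_contents($dst)), "\n";
```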

I've pasted the code which works here:
 [2012-09-28 22:25 UTC]
-Status: Open +Status: Not a bug
 [2012-09-28 22:25 UTC]
Thank you for taking the time to write to us, but this is not a bug. Please double-check the documentation and the instructions on how to report a bug.

This is not a bug. As noted, you need to fclose() the file handle before trying to read the contents of the file.
PHP Copyright © 2001-2023 The PHP Group
All rights reserved.
Last updated: Tue Nov 28 10:01:26 2023 UTC