Bug #62409 Using cURL multi to save urls to a file then reading files - files chopped
Submitted: 2012-06-25 15:41 UTC Modified: 2012-09-28 22:25 UTC
From: emmet at trovit dot com Assigned:
Status: Not a bug Package: cURL related
PHP Version: 5.3.14 OS: Mac OS X 10.6.8
Private report: No CVE-ID: None
 [2012-06-25 15:41 UTC] emmet at trovit dot com
Description:
------------
I'm using curl multi to download files in parallel. In the test script I specify that I want to put the contents of the download in a file using the CURLOPT_FILE option. The file is downloaded correctly, but when I try file_get_contents() or fopen() on the downloaded file, the contents are always chopped off. If I put the contents of the file in a variable and count the length of the string, it is always 40960 characters at most. The file itself downloads entirely to disk; it's just PHP that won't read it all and limits it to 40 KB.

Tested with PHP 5.3.8, 5.3.13 and 5.3.14, the same thing happens with all versions.

Output of php -v:
PHP 5.3.8 (cli) (built: Dec  5 2011 21:24:09) 
Copyright (c) 1997-2011 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2011 Zend Technologies
    with Xdebug v2.1.2, Copyright (c) 2002-2011, by Derick Rethans

PHP 5.3.13 (cli) (built: May  9 2012 07:21:29) 
Copyright (c) 1997-2012 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2012 Zend Technologies
    with Xdebug v2.1.3, Copyright (c) 2002-2012, by Derick Rethans

PHP 5.3.14 (cli) (built: Jun 25 2012 16:47:44) 
Copyright (c) 1997-2012 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2012 Zend Technologies

This bug is not quite the same as https://bugs.php.net/bug.php?id=52558, but they could be related.

Test script:
---------------
I've put the code on pastebin; I can upload it again or email it if needed:
http://pastebin.com/mMHVK85Z
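
In case the pastebin link no longer resolves, here is a minimal sketch of the pattern described above. The URL, filename, and variable names are placeholders, not the original script:

<?php
// Minimal sketch of the reported pattern (hypothetical URL and filename;
// the original test script is on pastebin). The file handle is deliberately
// left open before the read, which reproduces the truncation.
$url  = 'http://www.example.com/';   // placeholder download target
$path = '/tmp/download.html';        // placeholder output file

$fp = fopen($path, 'w');

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp); // write the response body to $fp

$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch);

// Drive the multi handle until the transfer finishes.
$running = null;
do {
    curl_multi_exec($mh, $running);
    if ($running > 0) {
        curl_multi_select($mh);
    }
} while ($running > 0);

curl_multi_remove_handle($mh, $ch);
curl_close($ch);
curl_multi_close($mh);

// Reading here, while $fp is still open, yields at most 40960 bytes:
// the tail of the download presumably still sits in the stream's write
// buffer and has not been flushed to disk yet.
$contents = file_get_contents($path);
echo strlen($contents), "\n";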

Expected result:
----------------
The script should echo the entire contents of the newly downloaded webpage (which has been saved to a local file).

Actual result:
--------------
The actual result is that only the first 40960 characters are output. Checking the file manually with tail -f shows that the file was downloaded entirely, but PHP won't read it all.

History
 [2012-06-25 16:43 UTC] emmet at trovit dot com
It turns out that everything works as expected when you close the original file handle just before calling file_get_contents() on the same file. So this is not a bug; it's more of an 'undocumented misbehaviour'.

I've pasted the code which works here:
http://pastebin.com/kzVMAjdJ
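
For reference, the change that makes it work amounts to closing the handle before reading. Sketched with the same placeholder names as above, rather than the exact pastebin code:

// Sketch of the fix: fclose() flushes PHP's buffered writes to disk, so a
// subsequent read sees the whole file.
fclose($fp);

$contents = file_get_contents($path); // now returns the complete download
echo $contents;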
 [2012-09-28 22:25 UTC] pierrick@php.net
-Status: Open +Status: Not a bug
 [2012-09-28 22:25 UTC] pierrick@php.net
Thank you for taking the time to write to us, but this is not
a bug. Please double-check the documentation available at
http://www.php.net/manual/ and the instructions on how to report
a bug at http://bugs.php.net/how-to-report.php

This is not a bug. As noted, you need to fclose() the file handle before trying to read the content of the file.