Bug #80061 Copying large files may have suboptimal performance
Submitted: 2020-09-04 16:29 UTC Modified: 2020-09-21 14:11 UTC
From: Assigned: cmb
Status: Closed Package: Performance problem
PHP Version: 7.4Git-2020-09-04 (Git) OS: *
Private report: No CVE-ID: None
 [2020-09-04 16:29 UTC]
As of PHP 7.4.0, large files are supposed to be copied by mapping
them into virtual memory at once.  That obviously cannot work for
files larger than any contiguous free region of the address space,
so copying files of 1 GiB or 2 GiB is likely to fail on 32-bit
(x86) architectures, for instance.  If the memory mapping fails, we
fall back to reading and writing in small chunks, which has
suboptimal performance.

See also <>.


 [2020-09-04 16:29 UTC]
-Assigned To: +Assigned To: cmb
 [2020-09-04 16:56 UTC]
The following pull request has been associated:

Patch Name: Fix mmap copying
On GitHub:
 [2020-09-21 14:11 UTC]
-Status: Assigned +Status: Closed
PHP Copyright © 2001-2023 The PHP Group
All rights reserved.
Last updated: Fri Sep 22 09:01:25 2023 UTC