Request #6746 Support for Sybase bcp and large result sets
Submitted: 2000-09-13 22:31 UTC Modified: 2004-02-01 09:52 UTC
Votes:1
Avg. Score:5.0 ± 0.0
Reproduced:1 of 1 (100.0%)
Same Version:0 (0.0%)
Same OS:0 (0.0%)
From: joschug at aol dot com Assigned:
Status: Closed Package: Feature/Change Request
PHP Version: 4.0.2 OS: Redhat 6.2
Private report: No CVE-ID: None
 [2000-09-13 22:31 UTC] joschug at aol dot com
I recently tried to do a SELECT * from a large table (about 2 million rows; the whole DB is about 500 MB).

I noticed that PHP quickly ran out of memory after I did a sybase_query() and before accessing the result set. After a quick browse through the source of php_sybase_ct.c I saw that PHP first reads all results into an internal buffer, and after that returns each row from that buffer via sybase_fetch_array() and the like.

I'd really like to see an incremental approach here; if I run the same query against an Oracle DB (using the OCI interface) I don't have these problems. I would like to dump and (re-)insert a whole table with PHP, but with Sybase this is currently not achievable. I tried it with server-side cursors, and it worked - but performance dropped by a factor of 20 :(

I'd also like to see an interface for doing bulk inserts with bcp, perhaps modeled on the interface sybperl provides for Perl. This would also be very useful for large inserts (bcp is faster by a factor of about 30 on my test system).

History
 [2004-02-01 09:52 UTC] thekid@php.net
As of PHP 4.3.0, sybase_unbuffered_query() with store_result = FALSE provides the necessary functionality.
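A minimal sketch of the unbuffered approach (this assumes a reachable Sybase server and ext/sybase_ct; the server name, credentials, database, table, and the process_row() helper are all placeholders):

```php
<?php
// Connect via ext/sybase_ct (server/user/password are placeholders).
$link = sybase_connect('SYBASE_SERVER', 'username', 'password')
    or die('Could not connect');
sybase_select_db('large_db', $link);

// With store_result = FALSE, rows are streamed from the server as they
// are fetched instead of being buffered in PHP's memory up front.
$result = sybase_unbuffered_query('SELECT * FROM big_table', $link, FALSE);

while ($row = sybase_fetch_array($result)) {
    // Process one row at a time; memory use stays roughly constant
    // regardless of the size of the result set.
    process_row($row); // placeholder for per-row work
}

sybase_free_result($result);
sybase_close($link);
?>
```

Since rows are streamed rather than buffered, functions that report on the whole result set (such as a total row count) may not give meaningful values until all rows have been fetched.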

As for the BCP wishes, I don't think this fits into ext/sybase_ct; maybe there should be a separate (PECL) extension for that.
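In the meantime, one possible workaround is to drive the bcp command-line utility that ships with Sybase ASE from PHP. A hedged sketch (the database, table, data file, field terminator, and login details below are all placeholder assumptions):

```php
<?php
// Bulk-load a character-mode data file with the Sybase bcp utility.
// All names and credentials here are placeholders.
$cmd = sprintf(
    'bcp %s in %s -c -t %s -U %s -P %s -S %s',
    escapeshellarg('mydb..big_table'),        // target table
    escapeshellarg('/tmp/big_table.dat'),     // data file to load
    escapeshellarg('|'),                      // field terminator
    escapeshellarg('username'),
    escapeshellarg('password'),
    escapeshellarg('SYBASE_SERVER')
);
exec($cmd, $output, $status);
if ($status !== 0) {
    die("bcp failed:\n" . implode("\n", $output));
}
?>
```

This avoids row-by-row INSERTs entirely, which is where bcp's speed advantage comes from, at the cost of an external process and a flat-file staging step.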
 