Greetings,
I've been writing PHP since about 2002 and I've never run across a problem quite like this. I'm having a weird problem in PHP 5.3.6 with a script that generates a report CSV file and then sends it to the browser. Seems simple enough; the full report file is only 16MB.

I have one physical server running CentOS 5 (64-bit). On my staging domain, PHP 5.3.6 runs in FastCGI mode under suPHP, and the script works fine there. When I move it to my production domain, which runs PHP 5.3.6 as an Apache module, it doesn't work. The production domain is on the same physical server and uses the exact same PHP install; the php.ini is the same for both sites.

After much testing, it seems the production domain can send a file of up to 1MB, but anything larger just makes my browser hang and eventually time out. Here's the code I've tried for sending the file.
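In case it helps, here's the sanity check I'd run in both environments to compare effective settings (the directive list is just my guess at likely suspects, since mod_php also honors php_value/php_flag overrides in the vhost config, so a shared php.ini doesn't guarantee identical runtime values):

```php
<?php
// Print the *effective* values of the directives most likely to differ
// between suPHP/FCGI and mod_php, even with a shared php.ini.
$suspects = array(
    'output_buffering',
    'zlib.output_compression',
    'implicit_flush',
    'memory_limit',
    'max_execution_time',
);
echo 'SAPI: ' . PHP_SAPI . "\n";
foreach ($suspects as $name) {
    $value = ini_get($name);
    echo $name . ' => '
        . ($value === false ? '(unknown directive)' : var_export($value, true))
        . "\n";
}
```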
Attempt 1:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . filesize($file));
readfile($file);
exit;
This craps out with anything over 1MB.
Attempt 2:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . filesize($file));
// kill output buffering (ob_end_clean() raises a notice when no buffer
// is active, so only call it while one exists)
while (ob_get_level()) {
    ob_end_clean();
}
flush();
readfile($file);
exit;
Attempt 3 - reading one line at a time:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . filesize($file));
// kill output buffering (ob_end_clean() raises a notice when no buffer
// is active, so only call it while one exists)
while (ob_get_level()) {
    ob_end_clean();
}
flush();
$fp = fopen($file, 'rb'); // binary-safe mode
while (($data = fgets($fp)) !== false) { // explicit check so a "0" line doesn't end the loop
    echo $data;
}
fclose($fp);
exit;
All three of the above examples have no trouble with my 16MB file on the staging domain under suPHP, and all three hang my browser and appear to do nothing with files over 1MB on the production domain running PHP as an Apache module. I even went so far as to give the script 2GB of memory, and the behavior is the same. Reading the file line by line lets me send slightly more data than readfile(): readfile() craps out right at 1MB, while with fgets() a 1006KB file goes through but a 1007KB file doesn't. I'm stumped. Anyone have any ideas?
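For what it's worth, here's a chunked-fread variant I haven't tried yet. The 8KB chunk size is an arbitrary guess, and the per-chunk flush() is the one thing none of my attempts above do:

```php
<?php
// Stream $path to the client in fixed-size chunks, flushing after each
// one so no output buffer ever has to hold the whole file at once.
function send_file_chunked($path, $chunkSize = 8192)
{
    $fp = fopen($path, 'rb'); // binary-safe read
    if ($fp === false) {
        return false;
    }
    while (!feof($fp)) {
        echo fread($fp, $chunkSize);
        flush(); // push each chunk out to the SAPI/web server
    }
    fclose($fp);
    return true;
}

// Usage (same headers as my attempts above):
//   header('Content-Type: application/octet-stream');
//   header('Content-Disposition: attachment; filename="' . $filename . '"');
//   header('Content-Length: ' . filesize($file));
//   send_file_chunked($file);
//   exit;
```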