Greetings,
I've been coding PHP since about 2002 and I've never run across a problem quite like this. I'm having a weird problem in PHP 5.3.6 with a script that generates a report CSV file and then sends it to the browser. Seems simple enough, and the full report file is only 16MB. I have one physical server running CentOS 5 (64-bit). On my staging domain, PHP 5.3.6 runs in FCGI mode under suPHP and the script works fine there. When I move it to my production domain, which runs PHP 5.3.6 as an Apache module, it doesn't work. The production domain is on the same physical server, uses the exact same PHP install, and the php.ini is the same for both sites.

After much testing, it seems the production server can send a file of up to 1MB, but anything larger just makes my browser hang and eventually time out. Here's the code I've tried for sending the file.

Attempt 1:

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . filesize($file));
readfile($file);
exit;

This craps out with anything over 1MB.

Attempt 2:

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . filesize($file));

// drop every level of output buffering so the file streams straight out
while (ob_get_level()) {
    ob_end_clean();
}
flush();
readfile($file);
exit;

Attempt 3 - reading 1 line at a time:

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . filesize($file));

// drop every level of output buffering so the file streams straight out
while (ob_get_level()) {
    ob_end_clean();
}
flush();
$fp = fopen($file, 'r');
// compare against false explicitly so a falsy line doesn't stop the loop early
while (($data = fgets($fp)) !== false) {
    echo $data;
}
fclose($fp);
exit;

All 3 of the above examples have no trouble with my 16MB file on the staging domain under suPHP, and all 3 hang my browser and seem to do nothing with files over 1MB on my production domain running PHP as an Apache module. I even went so far as to give the script 2GB of memory and the behavior is the same. Reading the file line by line lets me send slightly more data than readfile() does: readfile() seems to crap out right at 1MB, while with fgets() a 1006KB file goes through but a 1007KB file doesn't. I'm stumped. Anyone have any ideas?
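
For reference, here's a chunked-read variant of the same idea that I'm considering as a next step -- just a sketch, with $file and $filename assumed to be set the same way as in the attempts above:

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . filesize($file));

// drop every level of output buffering so each chunk goes straight to Apache
while (ob_get_level()) {
    ob_end_clean();
}

$fp = fopen($file, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192); // send 8KB at a time
    flush();               // push the chunk out to the client
}
fclose($fp);
exit;

If that behaves exactly the same way, I figure it points at something outside PHP rather than at readfile() itself.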

    What's your max_execution_time on your production box? It could be that it's timing out.
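
    Something like this dropped into the script would confirm what limits mod_php is actually running with -- just a sketch:

    // print the limits the Apache-module SAPI is really using (they can differ between SAPIs)
    echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
    echo 'memory_limit: ' . ini_get('memory_limit') . "\n";

    // and if the timeout were the culprit, lifting it just for the download script is a quick test
    set_time_limit(0);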

      jeepin81 wrote:

      What's your max_execution_time on your production box? It could be that it's timing out.

      It's 300 seconds, but the script takes less than 30 seconds to run. Thanks for your suggestion, but I'm starting to think it might be something our web host's networking department did to our firewall rules the other day when we suffered a DoS attack. I'm going to see if they have a clue about it.

        If you're running this locally inside the network, I don't see the firewall being the bottleneck, unless your production server is outside your LAN.

        Are you getting any errors and do you have all of your error reporting on?
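
        If error reporting isn't already cranked up, something like this at the top of the script would surface anything PHP is choking on -- just a sketch:

        // surface every notice/warning/fatal instead of failing silently
        error_reporting(E_ALL);
        ini_set('display_errors', '1');
        ini_set('log_errors', '1'); // also write to the error_log in case output has already started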

          It was a firewall rule, as I suspected. I repeatedly suggested to our web host that it looked like a networking problem and asked them to make sure they'd reverted any new firewall rules related to the recent DoS attack, but apparently a rate-limiting rule had been left on our firewall for the past two days. They finally removed the rule and everything is back to normal.
