I'm trying to add load limiting and queueing to the downloads section of a web site. The system updates a count of active users on each server as downloads start and stop, redirects users to the server under the least load, and places people in a queue when all servers are full.
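Roughly, the selection step works like the sketch below (simplified; the table, column, and helper names here are illustrative, not the real code). In the actual code, $best holds the chosen server, and $best['type'] == 'local' means the current server should serve the file itself:

$result = mysql_query(
    'SELECT hostname, active_downloads, max_downloads
     FROM servers
     ORDER BY active_downloads ASC
     LIMIT 1'
);
$best = mysql_fetch_assoc($result);

if ($best['active_downloads'] >= $best['max_downloads']) {
    // every server is at capacity, so queue this user instead
    Downloads::enqueueUser($userId); // hypothetical helper
} else {
    // send the user to the least-loaded server
    header('Location: http://'.$best['hostname'].'/download.php?file='.$fileId);
    exit;
}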
The problem lies in the actual downloading. The script reads the file and sends the data to the browser. Unfortunately, memory usage grows at an alarming rate: if the script is left running, it consumes so much memory that I can't even connect to the server via SSH and have to ask the host to restart it.
I removed the parts that update the in-progress download count, along with the ignore_user_abort() and related code that decrements the count even if the user cancels the download, but the problem persists.
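For context, the removed bookkeeping looked roughly like this (reconstructed from memory; the helper names are illustrative):

ignore_user_abort(true); // keep the script running even if the user cancels
Downloads::incrementCount($serverId); // hypothetical helper

// ... file is sent here ...

// this still runs after an abort, so the count stays accurate
Downloads::decrementCount($serverId);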
What am I doing wrong?
PHP 4.3.4, Apache 2.0.48, MySQL 4.0.18
if ($best['type'] == 'local') {
    $file = Downloads::getFile($settings['file']); // retrieves file information
    if (!is_readable($file['path'])) {
        // try from current dir.
        $origpath = $file['path'];
        $path = $file['path'];
        if ($path{0} == '/') {
            $file['path'] = '.'.$file['path'];
            if (!is_readable($file['path'])) {
                AtomError::Error(AE_ERROR, 'Could not open "'.$origpath.'" - the file could not be found or permission to read it was denied.');
                exit;
            }
        }
        unset($path);
    }
    $handle = fopen($file['path'], 'rb');
    if (!$handle) {
        AtomError::Error(AE_ERROR, 'Could not open '.$file['path'].'. The call to fopen() failed.');
        exit;
    }
    // turn off output buffering
    ob_end_clean();
    // send the file as a forced download
    header('Content-Disposition: attachment; filename="'.basename($file['path']).'"');
    header('Content-Type: application/octet-stream');
    header('Accept-Ranges: bytes');
    header('Content-Length: '.filesize($file['path']));
    // read the file in small chunks and echo each one to the browser
    while (!feof($handle)) {
        echo fgets($handle, 256);
    }
    fclose($handle);
    exit;
}