I've searched many forums about this problem, and yet I have not found a solution. So I'm kindly asking for your help. Thanks in advance.
I'm working with material bank software, based mainly on PHP and MySQL. The bank is designed to store, organize and share any kind of digital material; it is mainly used for pictures, movies and press-related files. Files in the bank are stored in a separate directory, accessible only through a special PHP script which sets the right headers (no-cache, content length, file name) and passes the file to the browser. I think this is a very basic case, and maybe some of you have worked on similar projects.
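To make the setup concrete, here is a rough sketch of the kind of download script described above. The function name and the example path are my own inventions, and the headers are just the ones mentioned (no-cache, content length, file name):

```php
<?php
// Hypothetical sketch of a pass-through download script.
// send_file() and the example path are assumptions, not the real code.
function send_file($path, $downloadName)
{
    header('Cache-Control: no-cache');
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . $downloadName . '"');

    // This is the step that causes trouble with big files:
    readfile($path);
}

// Usage (hypothetical path and display name):
// send_file('/data/materialbank/files/12345.mov', 'promo.mov');
?>
```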
The problem is that I have not found a suitable method for streaming files from the filesystem to the browser without first loading the whole file into memory. As you may guess, with big, "popular" files (500-700 MB) the memory of the web server gets overloaded. I think I don't have to explain what happens when the server runs out of memory. I have tried the readfile() and fpassthru() functions, and a loop in which fread() reads the file piece by piece and echo outputs it to the browser. I've tested different combinations and tried to restrict the download speed in the loop version, and I've come to this conclusion: PHP does not care about the speed at which the data flows to the browser; it just keeps stacking it into some sort of buffer. And yes, the output_buffering option in php.ini is set to off.
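For comparison, here is a chunked version of the fread() loop mentioned above, with the one detail that in my understanding usually makes the difference: calling flush() after each chunk so PHP hands the data to Apache instead of letting it pile up. The function name and chunk size are my own; treat this as a sketch, not a guaranteed fix:

```php
<?php
// Hypothetical chunked streaming loop. The key point is flush() per chunk.
function stream_file($path, $chunkSize = 8192)
{
    $fp = fopen($path, 'rb');
    if (!$fp) {
        return false;
    }
    while (!feof($fp)) {
        echo fread($fp, $chunkSize);
        flush(); // push this chunk out to Apache / the client right away
        // (if output_buffering were on, you would also call ob_flush() here)
    }
    fclose($fp);
    return true;
}
?>
```

Note that even with this, Apache itself (or a filter module sitting in the output chain) can still buffer on its side, so the memory use has to be checked end to end.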
I've tested this with the newest 4-series PHP available (4.3.8), Apache 2.0.48 and Red Hat 9.0 as the OS.
I have some other details too, but first I'd like to know if anyone here has stumbled into the same problem. One solution I'm considering is to create symbolic links to the files in a special directory (or directories) that is accessible directly, without scripts. The download script would then just send a standard HTTP redirect header pointing to the symbolic link. That way Apache would stream the file straight from disk to the browser without filling up memory, and PHP wouldn't have to be "on hold" for the whole download. It would just mean more messing around with Apache: taking care of cleanup of the links after downloads and setting the correct file names, but I guess Apache could do that with some hacking.
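The symlink idea could be sketched roughly like this. The directory layout, token scheme and function name are all hypothetical (and the cleanup of stale links would still have to happen elsewhere, e.g. in a cron job); the point is only that the redirect lets Apache serve the file straight from disk:

```php
<?php
// Hypothetical sketch of the symlink-and-redirect approach.
function make_download_link($path, $publicDir, $publicUrl)
{
    // Random token so the link is hard to guess.
    $token = md5(uniqid(rand(), true));
    $linkDir = $publicDir . '/' . $token;
    mkdir($linkDir);
    $link = $linkDir . '/' . basename($path);
    if (!symlink($path, $link)) {
        return false;
    }
    return $publicUrl . '/' . $token . '/' . rawurlencode(basename($path));
}

// Usage: redirect and let Apache do the streaming
// (assumes Options FollowSymLinks is enabled for $publicDir):
// header('Location: ' . make_download_link($file, '/var/www/dl', '/dl'));
?>
```

A nice side effect of naming the link after the original file is that the browser gets a sensible default file name without any extra header tricks.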
Anyway, thanks to all who had the time to read the whole message 🙂