I dug up an old function I wrote to add files/directories to a zip file. It runs on an as-needed basis, and even when I create large zip files it is fast as heck. You could do something like it to zip files and deliver them on the fly. For example, given a db result that holds the paths to all the files the person needs to download, just loop through and call it once per path. I combine this approach with a download identifier that maps that specific combination of files to its zip file, so if I need to serve the same collection of data again, I can just reuse the existing file. A scheduled task cleans up zip files that haven't been accessed in 72 hours and removes their mappings from the database.
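Here's a minimal sketch of that mapping idea, assuming a PDO connection to MySQL and a hypothetical zip_mappings table with identifier, zip_path, and last_access columns. The table schema and the getZipForFiles()/cleanupZips() helpers are illustrative, not my actual production code, and it calls the addToZip() function shown further down.

function getZipForFiles(array $paths, PDO $db, $zipDir) {
    sort($paths);
    // Hash the sorted file list so the same combination of files
    // always maps to the same identifier.
    $id = sha1(implode("\n", $paths));

    $stmt = $db->prepare('SELECT zip_path FROM zip_mappings WHERE identifier = ?');
    $stmt->execute([$id]);
    $existing = $stmt->fetchColumn();

    if( $existing !== FALSE && is_file($existing) ) {
        // Reuse the already-built zip; bump last_access for the cleanup task.
        $db->prepare('UPDATE zip_mappings SET last_access = NOW() WHERE identifier = ?')
           ->execute([$id]);
        return $existing;
    }

    // No usable zip yet, so build one from the requested files.
    $zipPath = rtrim($zipDir, DIRECTORY_SEPARATOR) . DIRECTORY_SEPARATOR . $id . '.zip';
    $zip = new ZipArchive();
    if( $zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== TRUE ) {
        return FALSE;
    }
    foreach( $paths as $path ) {
        // TRUE resets the static root per path: a file is stored under its
        // basename, a directory's contents relative to that directory.
        addToZip($path, $zip, TRUE);
    }
    $zip->close();

    $db->prepare('INSERT INTO zip_mappings (identifier, zip_path, last_access) VALUES (?, ?, NOW())')
       ->execute([$id, $zipPath]);
    return $zipPath;
}

// The cleanup task (run it from cron): delete zips that haven't been
// accessed in 72 hours and drop their mappings. INTERVAL syntax is MySQL.
function cleanupZips(PDO $db) {
    $rows = $db->query('SELECT identifier, zip_path FROM zip_mappings WHERE last_access < NOW() - INTERVAL 72 HOUR');
    foreach( $rows as $row ) {
        if( is_file($row['zip_path']) ) {
            unlink($row['zip_path']);
        }
        $db->prepare('DELETE FROM zip_mappings WHERE identifier = ?')
           ->execute([$row['identifier']]);
    }
}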
I don't have any actual benchmarks for this function, but I haven't noticed any slowdowns on the server or wait times when requesting a download. Hope this helps.
function addToZip($path, ZipArchive $zip, $reset = FALSE) {
    static $root;
    // (Re)compute the root that entry names are made relative to.
    if( $reset || $root === NULL ) {
        $real = realpath($path);
        if( $real === FALSE ) {
            return FALSE; // path doesn't exist
        }
        $root = (is_dir($real) ? $real : dirname($real)) . DIRECTORY_SEPARATOR;
    }
    if( is_file($path) ) {
        // Strip the root prefix so the archive keeps a relative structure.
        $file = str_replace($root, '', $path);
        if( !$zip->addFile($path, $file) ) {
            trigger_error('Failed adding ' . $path . ' to zip file.', E_USER_WARNING);
        }
    } elseif( is_dir($path) ) {
        $dir = glob(realpath($path) . DIRECTORY_SEPARATOR . '*');
        if( $dir === FALSE ) {
            return FALSE;
        }
        // Directories first, then files, alphabetically within each group.
        usort($dir, function($a, $b) {
            if( is_dir($a) && is_file($b) ) return -1;
            if( is_file($a) && is_dir($b) ) return 1;
            return strcasecmp($a, $b);
        });
        foreach( $dir as $sub ) {
            addToZip($sub, $zip);
        }
    } else {
        return FALSE; // neither a file nor a directory
    }
    return TRUE;
}
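For completeness, a hypothetical usage example; the archive and source paths are just placeholders:

$zip = new ZipArchive();
if( $zip->open('/tmp/archive.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) === TRUE ) {
    addToZip('/var/www/uploads', $zip, TRUE); // TRUE (re)sets the static root
    $zip->close();
}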
Edit: Also, please stick to your idea of NOT using DRM. When I come across DRM-protected files, I refuse to ever again use the purveyor I obtained them from.