Hi there.
Here's my code, I'm sure I'm doing something wrong in the fread loop but I don't know what.
I have two problems: 1) This takes a really long time top copy large files (taking several minutes to copy a few hundred M and 2) It takes a long time to copy small files too, but most of that is spent re-reading the lsat chunk of the file (as far as I can tell)
What I'm trying to do is make a script that will (eventually, when I can get the thing to work!) be used to backup files from one location to two others at the same time. So that if one drive fails, then I still have a backup on the other destination. I made it clever enough that you just (right now) direct the browser to file.php?src=SOURCE&dest=DEST1&dest=DEST2 and so on, theoretically ad infinitum.
Do you think that I'll ever get this to run on larger files (I've had problems with larger files, >4GB, where only a certain amount of data gets copied and I'm left with a corrupted file in my backup.) and be faster than just execing cp SOURCE DEST1;cp SOURCE DEST2?
I apologise if that made less sense than you would like
<?php
// set_time_limit (0); // Stop the script from timing out.
set_time_limit (1); // Make the script time out sooner.
$startTime = time();
$filesize = 0;
$arg = explode ('&', substr ($_SERVER['REQUEST_URI'], strpos ($_SERVER['REQUEST_URI'], '?')));
$_GET['src'] = stripslashes ($_GET['src']);
$dst = array();
$chunkNum = 1;
foreach ($arg as $thisArg)
{
if (substr ($thisArg, 0, strpos ($thisArg, '=')) == 'dest')
{
$dst[count($dst)] = rawurldecode (substr ($thisArg, (strpos ($thisArg, '=') + 1)));
}
}
if ((isset ($_GET['src'])) && (count ($dst) > 0) && (file_exists ($_GET['src'])))
{
if ($sourceFile = fopen ($_GET['src'], 'rb'))
{
// Source file is accessible.
echo 'Reading source!'."\n\n";
flush();
foreach ($dst as $thisDest)
{
$destFile[count ($destFile)] = fopen ($thisDest, 'wb');
}
// If a destFile was not open-able, then it should be silently ignored and the script skips to the next one...
while (($buffer = fread($sourceFile, 4096)) !== false)
{
echo 'Processing 4kb chunk '.$chunkNum."\n";
flush();
$chunkNum++;
foreach ($destFile as $thisDest)
{
fwrite ($thisDest, $buffer);
$filesize += strlen ($buffer);
}
/* It seems to freeze here... The files are copied to the new location, and can be opened and viewed, but it
never releases the lock on the file, and never echos 'Done Copying!' In addition, it seems to be giving
me many more chunks than it should be... A 2 MB image file copies fine, but with the timeout set to one
second, it copies the file then reads another 7 MB worth of chunks before being timed out then stopped. */
}
echo "\n\n".'Done copying!';
flush();
fclose ($sourceFile);
foreach ($destFile as $thisDest)
{
fclose ($thisDest);
}
echo 'The file '.$_GET['src'].' was successfully copied to '.count ($destFile).' locations.<br />';
echo $filesize.' bytes (about '.round ($filesize / 1048576).'MB) were copied in '.(time() - $startTime).'seconds. (About '.round((time() - $startTime) / 60).' minutes.)';
flush();
}
else
{
echo 'Error: Source file specified ('.$_GET['src'].') could not be accessed by the web server process!';
}
}
elseif (!isset ($_GET['src']))
{
echo 'Error: No source file specified! Specify one with &src=<file> in your address bar.';
}
elseif (count ($dst) == 0)
{
echo 'Error: No destination file(s) specified!';
}
elseif (!file_exists ($_GET['src']))
{
echo 'Error: Source file specified ('.$_GET['src'].') does not exist!';
}
?>