Hi all,
I am trying to make local copies of files from another server, but am having trouble with the program locking up.
I cannot FTP the files, as they don't actually exist: they are generated by a Lotus Notes agent when you view them over HTTP.
I can open and close the files without any problem; the freeze happens when reading from the file, and even then only about one time in twenty. I have tried the file("filename") approach as well as opening the file with fopen(). The most reliable method so far is opening the file and grabbing it one character at a time (not very efficient, I know, but efficiency is not an issue for this code), yet it still locks up from time to time.
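For reference, the file() variant I tried looked roughly like this (a simplified sketch; $url and $news_file are the same variables as in the full listing below):

# Simplified sketch of the file() approach I also tried.
$lines = file($url . $news_file);   # fetch the whole page over HTTP
if ($lines === false) {
    die("Cannot read from file");
}
$file = implode("", $lines);        # file() returns an array of lines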
I need the program to run on a crontab, so it has to be reliable.
The code is as follows:
echo("\nMaking local copies of ASCII files...\n\n");
foreach ($files as $news_file)
{
$filename = $url . $news_file;
$html_files= "../html_files/";
echo("Opening " . $filename . "...");
if (!$tmp_file=fopen($filename,"r"))
{
echo("Could not open" . $filename . ", retrying...\n");
return;
}
else
{
echo(" file open\n");
echo ("Reading file...\n");
while(!feof($tmp_file))
{
#This is where the code is freezing
$file .= fgetc($tmp_file) or die ("Cannot read from file");
}
echo("...Done!\n");
fclose($tmp_file);
$writename= $html_files . $news_file;
$output = fopen($writename,"w");
fputs($output,$file);
fclose($output);
echo("$writename written!\n");
}
$count++;
}
echo("Local copies made!!\n\n");
Does anyone know of a way to run a timeout on the while loop? That would allow me to rerun the file grab until it succeeds. set_time_limit() kills the whole script, which is not what I want.
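The closest thing I have found is the per-stream timeout functions, though I am not certain they apply to http:// fopen streams in every PHP version, which is really what I am asking. A sketch of what I mean (socket_set_timeout() is the PHP 4 name; PHP 4.3+ has stream_set_timeout() and stream_get_meta_data() instead; the 30-second limit and 5-attempt cap are just example values):

# Hypothetical sketch: a timed read plus a retry loop, instead of
# blocking forever in fgetc().
function grab_file($filename)
{
    $tmp_file = @fopen($filename, "r");
    if (!$tmp_file) {
        return false;
    }
    socket_set_timeout($tmp_file, 30);  # give each read 30 seconds
    $file = "";
    while (!feof($tmp_file)) {
        $chunk  = fread($tmp_file, 4096);
        $status = socket_get_status($tmp_file);
        if ($status["timed_out"]) {     # the read stalled
            fclose($tmp_file);
            return false;               # let the caller retry
        }
        $file .= $chunk;
    }
    fclose($tmp_file);
    return $file;
}

# Rerun the grab until successful, capped so cron runs can't pile up.
for ($attempt = 1; $attempt <= 5; $attempt++) {
    $file = grab_file($filename);
    if ($file !== false) {
        break;
    }
    echo("Attempt $attempt failed, retrying...\n");
}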
VERY grateful for any advice. Thanks,
Len