Okay folks, I am developing a script that reads information from a variety of basic ASCII text documents. They are NOT delimited.
Some of the text files are around 50,000 bytes. Typically, the first 10,000 bytes can be ignored, as well as the last 10,000 bytes. I am really only dealing with the middle of the files.
Currently, my script takes about 8-10 seconds to run on one file. I am now looking for ways to increase the speed. Here is what I am using so far.
$contents = file_get_contents("http://msnbc.com/news/summary.asp");
$contents = substr($contents, 11000);
$contents = substr($contents, 0, -9000);
/////////////////////////////////////////////////////////
$file = fopen("sometextfile", "r");
fseek($file, 10000);
$contents = "";
// note: this loop reads all the way to EOF, so the last 10,000 bytes are still included
while (!feof($file)) { $contents .= fread($file, 4096); }
fclose($file);
/////////////////////////////////////////////////////////
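Since I only want the middle, I was also thinking of computing the slice length up front and doing a single fread(), so the loop (and the tail bytes) are skipped entirely. Just a sketch, not tested against my real files; the function name and defaults are made up:

```php
<?php
// Return only the middle of a file: skip $head bytes at the start
// and ignore $tail bytes at the end, using one exact-length fread().
function read_middle($path, $head = 10000, $tail = 10000) {
    $middle = filesize($path) - $head - $tail;   // how much we actually want
    if ($middle <= 0) {
        return "";                               // file too small to have a middle
    }
    $fp = fopen($path, "rb");
    fseek($fp, $head);                           // jump straight past the header
    $data = fread($fp, $middle);                 // one read, no loop, no tail
    fclose($fp);
    return $data;
}
```

That way nothing past the cutoff point ever gets read off disk.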
Does anyone know of a way to increase the speed of this script? I had thought of using something like this to speed it up, but it didn't really change anything:
/////////////////////////////////////////////////////////
$contents = file_get_contents("sometextfile");
$contents = substr($contents, 11000);
$contents = substr($contents, 0, -9000);
/////////////////////////////////////////////////////////
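I also noticed file_get_contents() itself takes an offset and a maximum-length argument (PHP 5.1+), so in theory the middle slice can come back in a single call, with no substr() passes at all. A sketch of what I mean, with a made-up function name; the one catch is you need filesize() first to know where the tail starts:

```php
<?php
// Grab just the middle slice in one call: the 4th argument to
// file_get_contents() is the byte offset, the 5th is the max bytes to read.
function middle_of($path, $head = 10000, $tail = 10000) {
    $len = filesize($path) - $head - $tail;   // length of the middle slice
    if ($len <= 0) {
        return "";                            // nothing between head and tail
    }
    return file_get_contents($path, false, null, $head, $len);
}
```

No idea yet if it is actually faster than fseek()/fread(), but it is certainly less code.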
I am wondering if anyone else has dealt with something similar, or is just more knowledgeable about which way works best: deal with the file using a pointer and read it into a string, or read the whole thing into a string and then rip the ends off?
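In case it helps anyone answer, I figured the honest way to settle it is to time each approach on my actual files. Here's the little microtime(true) harness I was going to use (the helper name is just something I made up):

```php
<?php
// Time a callable over several runs and return the average seconds per run.
function time_it(callable $fn, $runs = 10) {
    $start = microtime(true);
    for ($i = 0; $i < $runs; $i++) {
        $fn();                                    // run the approach under test
    }
    return (microtime(true) - $start) / $runs;    // average seconds per run
}
```

Usage would be something like `$t = time_it(function () { /* read the file here */ });` for each variant, then compare the numbers.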
Any help would be awesome.
Thanks in advance folks.
AZ3