Yeah, actually when I first wrote it I used fgetc, reading one character at a time and checking whether it was a newline (\n), incrementing the line counter when it was; that's why I thought it might run slow. Then I remembered that fgets reads up to 1024 bytes or an entire line, whichever comes first, so I edited the post. This should run faster than using file() since it doesn't store the contents in any variable; it just moves the file pointer to the desired line. Once the pointer is at line 120, you can continue reading as many or as few lines as you want from there.
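The post never shows lineseek() itself, so here's a minimal sketch of how it could work; the function name matches the usage below, but the exact signature and the 1-based line numbering are assumptions:

```php
<?php
// Hypothetical sketch of lineseek(): advance the file pointer to the
// start of line $line (1-based) by discarding earlier lines with fgets(),
// without keeping any of them in memory.
function lineseek($fp, $line) {
    rewind($fp);                      // start from the top of the file
    for ($i = 1; $i < $line; ++$i) {  // discard the first $line - 1 lines
        if (fgets($fp) === false) {   // hit EOF before reaching the target
            return false;
        }
    }
    return true;                      // pointer now sits at start of $line
}
```

Since nothing is stored, memory stays flat no matter how deep into the file you seek; the cost is purely the sequential scan.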
So what you want to do is:
//...
lineseek($fp, 120);
$lines = array();
for ($i = 0; $i < 20; ++$i) {
$lines[] = trim(fgets($fp)); //read one line, trim \n, push it to lines array
}
print_r($lines); //should print lines 120-139 (20 lines starting at 120)
Edit:
Tested it on a 60 MB video file with 147,558 lines. Seeking to line 56,500 consistently took 0.11 seconds (with variation only further out in the decimals), and seeking to line 100,000 took an average of 0.19 seconds. I didn't check the memory usage, but it shouldn't be much different from that of a "normal" PHP script.