I'd love some help figuring out what's causing this problem:
Description_______________________
I have a script that parses a log file and builds a report. When the log file is around 200 lines, it works perfectly. When the log file is around 2,000 lines, it gets about 60% of the way through and then abruptly quits.
However, it doesn't actually stop there. The PHP script gets re-fetched and starts over from the beginning. Then it gets about 60% done and quits - only to re-fetch the script and start again.
I'm assuming it's some kind of memory issue, so I upped memory_limit to 16M, but that didn't help at all. Besides, if it were a memory fault, wouldn't PHP just kill the process? I don't understand why the script keeps getting re-fetched and re-run.
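For what it's worth, this is the kind of change I mean (whether done in php.ini or at runtime):

```php
<?php
// Equivalent to "memory_limit = 16M" in php.ini,
// but applied at runtime for this script only.
ini_set('memory_limit', '16M');
echo ini_get('memory_limit'); // prints "16M"
```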
More detail_________________
The script works this way: It allows you to upload a log file or run a report from a log you've already uploaded.
Then it reads that file into an array using file().
I then loop over every line of that array with a for loop. On each pass, the totals are incremented (for the summary report) and the details of each transaction are appended to a string variable, which is tacked onto the end of the summary before the report is presented.
There are two db calls for each line of the log file.
I wrote a simple message to the error log, "made it to line $x of $total", so I could see where the problem was occurring. The first run quit at line 1343, then 1372, then 1353 - no real rhyme or reason as far as I can tell.
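In case it helps, here's roughly the shape of the script (the function and variable names below are made up for illustration; they're not my actual code):

```php
<?php
// Rough sketch of the flow described above. parse_line(),
// db_lookup_customer(), db_record_transaction(), format_transaction(),
// and build_summary() are hypothetical stand-ins for my real functions.

$lines = file($logPath);   // whole log read into an array at once
$total = count($lines);
$totals = array();         // counters for the summary report
$details = '';             // grows with every transaction

for ($x = 0; $x < $total; $x++) {
    $fields = parse_line($lines[$x]);

    // two DB calls per log line, as described
    $customer = db_lookup_customer($fields['customer_id']);
    db_record_transaction($fields);

    $type = $fields['type'];
    $totals[$type] = isset($totals[$type]) ? $totals[$type] + 1 : 1;
    $details .= format_transaction($fields, $customer);

    // the progress marker I used to find where it dies
    error_log("made it to line $x of $total");
}

echo build_summary($totals) . $details;
```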
Thanks for any help or suggestions.