I have a script that parses large CSV files. It prints a sprinkling of "progress messages", two per parsed line, and I am using output buffering with ob_flush() to flush the buffer at various points.
During the initial processing it is very fast, importing 6,700+ rows in the first 10 minutes, but 4 hours and 35,000 rows later it is down to about 650 rows per 10 minutes.
I am making sure that I unset my variables and call mysql_free_result() wherever a query actually returns a result set.
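For context, the per-row cleanup looks roughly like this (a simplified sketch, not my actual code; the connection details and table/column names are made up):

    <?php
    // Simplified per-row loop: free each result set and drop references
    // before moving on to the next line (names below are placeholders).
    mysql_connect('localhost', 'user', 'pass');
    mysql_select_db('import_db');

    $handle = fopen('import.csv', 'r');
    while (($row = fgetcsv($handle)) !== false) {
        $sql    = sprintf("SELECT id FROM products WHERE sku = '%s'",
                          mysql_real_escape_string($row[0]));
        $result = mysql_query($sql);

        if ($result) {
            $existing = mysql_fetch_assoc($result);
            // ... insert or update based on $existing ...
            mysql_free_result($result);   // release the result set right away
        }

        unset($row, $existing, $result);  // clear per-row variables
    }
    fclose($handle);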
After the 4+ hours, Firefox is using over 200 MB of memory according to Task Manager.
I have modified the script so that it only outputs anything if a flag is set, but I am not sure that is all I can do to make this run better.
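The gating I added looks roughly like this (a sketch, not my exact code; $verbose is just the name I am using here for the flag):

    <?php
    ob_start();
    $verbose = false;   // progress flag; when false, nothing is echoed or flushed

    function progress($msg, $verbose)
    {
        if (!$verbose) {
            return;                 // skip both the echo and the flush
        }
        echo $msg . "<br />\n";
        ob_flush();                 // push PHP's output buffer...
        flush();                    // ...and the server's buffer to the browser
    }

    for ($lineNo = 1; $lineNo <= 3; $lineNo++) {
        progress("Parsing line $lineNo", $verbose);    // first of the two messages per line
        // ... import work for this line ...
        progress("Imported line $lineNo", $verbose);   // second message
    }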
One more question: I could import the CSV using LOAD DATA INFILE and then iterate through the imported records to process each one, but would that be any faster than importing the CSV line by line?
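What I have in mind is something like this (the file path, staging table, and column list are placeholders, not my real schema):

    <?php
    // Bulk-load the whole CSV into a staging table in one statement,
    // then iterate over the rows in the database instead of parsing in PHP.
    mysql_connect('localhost', 'user', 'pass');
    mysql_select_db('import_db');

    $load = "LOAD DATA LOCAL INFILE '/path/to/import.csv'
             INTO TABLE staging_products
             FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
             LINES TERMINATED BY '\\n'
             IGNORE 1 LINES
             (sku, name, price)";
    mysql_query($load) or die(mysql_error());

    // per-record pass over the staging table
    $result = mysql_query("SELECT sku, name, price FROM staging_products");
    while ($row = mysql_fetch_assoc($result)) {
        // ... insert/update the real tables from $row ...
    }
    mysql_free_result($result);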