So, I have a text file that I must upload (no problem!), open, parse/sub-parse line by line, and either update or insert each record into MySQL. The file weighs in at over 2 MB and will hold about 30,000 records.
Right now I'm only at the point where I upload, open, parse, and echo the string results out, and even THAT hangs my browser.
So, my question: what would you suggest as the fastest, most efficient way to code this? I'm trying to avoid the whole browser-hangup thing.
function importfile($fileName, $explodeDelimiter)
{
    $lines = file($fileName);
    foreach ($lines as $line_num => $line) {
        // strip the trailing newline so it doesn't end up glued to the last field
        $lineArr = explode($explodeDelimiter, rtrim($line, "\r\n"));

        // this 'echo' section is only for testing; it will be replaced with MySQL updates and inserts
        echo "Line #<b>{$line_num}</b><br />";
        foreach ($lineArr as $newString_num => $newString) {
            // right here I would query MySQL to see whether the record exists and, if so, update it; if not, insert a new one
            echo "[" . $newString_num . "] " . $newString . "<br />";
        }
        echo "<br />";
    }
    echo "<br /><br />";
}
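For reference, here's roughly the shape the update-or-insert step would take. This is just a sketch: the `records` table, its columns, and the connection details are all made up, and it assumes the table has a unique key on `record_id` so MySQL's INSERT ... ON DUPLICATE KEY UPDATE can do the exists-check for me:

$db = new mysqli('localhost', 'user', 'pass', 'mydb'); // placeholder credentials
$stmt = $db->prepare(
    "INSERT INTO records (record_id, name, value)
     VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE name = VALUES(name), value = VALUES(value)"
);

foreach (file($fileName) as $line) {
    $fields = explode($explodeDelimiter, rtrim($line, "\r\n"));
    // one upsert per line: the unique key on record_id decides insert vs. update
    $stmt->bind_param('iss', $fields[0], $fields[1], $fields[2]);
    $stmt->execute();
}
$stmt->close();

The idea being one statement per line instead of a SELECT followed by an UPDATE or INSERT, which should roughly halve the round trips to the database.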
Again, I can make this work, but I'll have to increase the script timeout and who knows what else. That's not a problem in itself, except I'm going to run these updates three times a day. So, really, I'm looking for opinions/suggestions on optimizing the process.
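(For what it's worth, the "increase the script timeout" part would look something like this; the memory value is a guess:)

set_time_limit(0);              // remove the execution time limit for this run
ini_set('memory_limit', '64M'); // file() pulls the whole 2 MB file into memory at once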