I have a very large (55 MB) .tab file that I need to parse, analyze, and insert into MySQL.
The file is already sitting on the server, so there are no upload concerns; however, just parsing the file hangs the server, and I haven't even gotten to the analysis step yet.
At this point, I'm looking for suggestions on how to work with this data. Is it possible to split the file into smaller pieces? Are there other options I might not know about? Here's the function I'm currently using:
function importfile($fileName, $explodeDelimiter)
{
    // file() reads the entire file into memory as an array of lines --
    // with a 55 MB file, this is where the server hangs
    $lines = file($fileName);
    foreach ($lines as $line_num => $line) {
        // split each tab-delimited line into its fields
        $lineArr = explode($explodeDelimiter, $line);
        foreach ($lineArr as $newString_num => $newString) {
            //do the analyzing and inserting
        }
    }
}
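
For reference, here's a rough sketch of what I mean by working through the file in smaller pieces: reading one line at a time with fgets() instead of pulling the whole thing into memory with file(). This is just an untested idea (the function name and the per-row processing comment are placeholders), not something I've actually run:

// rough, untested sketch -- process the file one line at a time
function importfileByLine($fileName, $explodeDelimiter)
{
    $handle = fopen($fileName, 'r');
    if ($handle === false) {
        return false; // couldn't open the file
    }
    // fgets() only holds one line in memory at a time
    while (($line = fgets($handle)) !== false) {
        $lineArr = explode($explodeDelimiter, rtrim($line, "\r\n"));
        foreach ($lineArr as $newString) {
            // do the analyzing and inserting here, one row at a time
        }
    }
    fclose($handle);
    return true;
}

Would something along those lines be the right direction, or am I missing a better option?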