You could try this idea: process the file over multiple HTTP requests of, say, 5,000 records each.
The code below only does one record at a time, but just add a loop for whatever batch size you want to handle (there's a sketch of that after the code).
A session variable stores the current file position, and there's even a nice progress bar using a DIV element.
I'm using a META page refresh, but you could try a SCRIPT refresh if you have problems (see the snippet at the end).
<?php
session_start();

$filename = "data.txt";
$fp = fopen($filename, "r");
if ($fp)
{
    // On the first request there is no saved offset yet, so start at 0.
    if (!isset($_SESSION["offset"]))
    {
        $_SESSION["offset"] = 0;
    }
    fseek($fp, $_SESSION["offset"]);
    if ($s = fgets($fp))
    {
        // Parse the CSV line and insert the record into the database here.
        $_SESSION["offset"] = ftell($fp);
        echo "<head><meta http-equiv='refresh' content='2'></head>";
        $progress = (int)(ftell($fp) * 100 / filesize($filename));
        echo "<h2>Processing Records. Please Wait...</h2>";
        echo "<div style='background:#9999FF;width:$progress%'>$progress%</div>";
    }
    else
    {
        echo "<h2>Processing Finished</h2>";
        unset($_SESSION["offset"]);
    }
    fclose($fp);
}
?>
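To batch it up as I suggested, something like this would do. $batchSize is just a name I picked, and the parse/insert step is still yours to fill in:

<?php
session_start();

$filename  = "data.txt";
$batchSize = 5000; // records to process per HTTP request
$fp = fopen($filename, "r");
if ($fp)
{
    if (!isset($_SESSION["offset"]))
    {
        $_SESSION["offset"] = 0;
    }
    fseek($fp, $_SESSION["offset"]);

    // Read up to $batchSize lines before handing control back to the browser.
    $count = 0;
    while ($count < $batchSize && ($s = fgets($fp)))
    {
        // Parse the CSV line and insert the record into the database here.
        $count++;
    }
    if ($count > 0)
    {
        $_SESSION["offset"] = ftell($fp);
        echo "<head><meta http-equiv='refresh' content='2'></head>";
        $progress = (int)(ftell($fp) * 100 / filesize($filename));
        echo "<h2>Processing Records. Please Wait...</h2>";
        echo "<div style='background:#9999FF;width:$progress%'>$progress%</div>";
    }
    else
    {
        echo "<h2>Processing Finished</h2>";
        unset($_SESSION["offset"]);
    }
    fclose($fp);
}
?>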
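And if the META refresh plays up, the SCRIPT version is just a timed reload. Replace the echo of the META tag with this:

// Reload the page after 2 seconds, same effect as the META refresh above.
echo "<script>setTimeout(function() { location.reload(); }, 2000);</script>";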
It's just an idea. Sing out if it works and report your views on the idea.