Indexes slow down adding data; they speed up SELECT statements.
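If the indexes are the main cost during the load, one option, at least for MyISAM tables, is to switch the non-unique indexes off for the duration of the bulk insert and rebuild them afterwards. A rough sketch, where my_table is just a placeholder name and an open mysql_* connection is assumed:

//sketch only: my_table is a placeholder; DISABLE KEYS affects non-unique indexes on MyISAM
mysql_query("ALTER TABLE my_table DISABLE KEYS"); //stop maintaining the indexes row by row
//... run the bulk INSERTs here ...
mysql_query("ALTER TABLE my_table ENABLE KEYS");  //rebuild the indexes in one pass at the end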
I have manually loaded millions of records into a table in one run, so a slowdown is to be expected when you are populating a table.
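If you are doing that kind of one-shot load from a file, LOAD DATA INFILE is normally much faster than looping over INSERT statements in PHP. A rough sketch, where the path and table name are only examples:

//sketch only: bulk-load one CSV chunk into a staging table (path and table name are examples)
$sql = "LOAD DATA LOCAL INFILE '/path/to/chunk_1.csv'
        INTO TABLE staging_table
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES"; //skip the header row
mysql_query($sql);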
Even with php.ini tweaking I was running into the same problem. What I was able to do was process one batch of results, then recall the script using JavaScript, with GET vars telling the next run where to start. This let me call the script 45 million times at one second per run, instead of calling it once with 45 million seconds to process.
//at top of script: read the batch size and starting point from the query string
//cast to int since these values go straight into the LIMIT clause
if(isset($_GET['process_count'])){
    $process_count = (int)$_GET['process_count'];
}else{
    $process_count = 1;
}
if(isset($_GET['next_record'])){
    $next_record = (int)$_GET['next_record'];
}else{
    $next_record = 1;
}

//work out where this run's batch starts
if($process_count < $next_record){
    $start_record = $next_record - $process_count;
}else{
    $start_record = $next_record;
}
...
//sql statement: grab just this run's slice of the source table
$sql = "SELECT * FROM table WHERE field = value LIMIT $start_record, $process_count";
...
//at bottom of script
echo "<script>window.location ='http://localhost/DataBuilders/extract_all.php?next_record=".$next_record."&sleep=".$sleep."&process_count=".$process_count."&debug=".$debug."';</script>";
Obviously this only works if your source data is already in a table. In my case, with 45 million records to load, I broke the one CSV file into smaller ones, loaded those into a database, and then ran the above script to select a group of rows at a time and add them to their normalized tables.
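For the CSV splitting step, a minimal sketch that streams the big file and starts a new chunk every 100,000 lines (file names and chunk size are just examples):

//sketch only: split a large CSV into fixed-size chunk files
$in    = fopen('all_records.csv', 'r');
$lines = 0;
$chunk = 0;
$out   = null;
while(($row = fgets($in)) !== false){
    if($lines % 100000 == 0){              //time to start a new chunk file
        if($out){ fclose($out); }
        $chunk++;
        $out = fopen("chunk_".$chunk.".csv", 'w');
    }
    fwrite($out, $row);
    $lines++;
}
if($out){ fclose($out); }
fclose($in);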