The fields are small: name, city, state, zip, email.
The way I tried it, I open the file and read it line by line using fgets:
$f = fopen($file, "r");
while (($buffer = fgets($f, 1092)) !== false) { // I've tried this with various buffer sizes
    // (checking fgets() directly avoids the extra, empty pass that a feof() test allows at EOF)
Then I split the line into its parts as follows:
list($name, $city, $state, $zip, $email) = split(",", rtrim($buffer)); // the comma stands in for the real delimiter; rtrim() drops the newline fgets() leaves on
Once I have the variables, I insert them into one database table, grab the ID MySQL assigned to that record, and insert it into a second table.
So with each iteration, the script has to:
1- split the line into its parts
2- insert into table 1
3- pull out id (using LAST_INSERT_ID) from table 1
4- insert into table 2.
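Stripped down, one iteration looks roughly like this (the table and column names are made up for illustration, the comma delimiter is just an example, and I'm using the mysql_* functions):

    // one pass through the loop -- placeholder table/column names
    list($name, $city, $state, $zip, $email) = split(",", rtrim($buffer));

    // 1) insert into the first table, escaping each value
    $sql = sprintf(
        "INSERT INTO people (name, city, state, zip) VALUES ('%s', '%s', '%s', '%s')",
        mysql_real_escape_string($name),
        mysql_real_escape_string($city),
        mysql_real_escape_string($state),
        mysql_real_escape_string($zip)
    );
    mysql_query($sql);

    // 2) pull out the id MySQL just assigned
    $row = mysql_fetch_row(mysql_query("SELECT LAST_INSERT_ID()"));
    $id  = $row[0]; // mysql_insert_id() would return the same thing without the extra query

    // 3) insert into the second table, keyed on that id
    mysql_query(sprintf(
        "INSERT INTO emails (person_id, email) VALUES (%d, '%s')",
        $id,
        mysql_real_escape_string($email)
    ));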
Any ideas?
Oh, the database connection and db selection are done OUTSIDE the loop (thankfully, I'm not that dumb, although I'm not sure how much that would add to overhead with PHP's use of persistent connections).
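For completeness, the connection setup outside the loop is roughly this (host, credentials, and database name are placeholders):

    $db = mysql_pconnect("localhost", "dbuser", "dbpass"); // persistent connection
    mysql_select_db("import_db", $db);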