I need advice for my problem: I'm trying to process a big text file (over 6 MB) and insert some data from each row into a database, but right now I get the fatal error "Allowed memory size of 8388608 bytes exhausted (tried to allocate 35 bytes)".
So I was thinking that maybe it's possible to open the text file in a loop, read one row, close the file, insert that row into the database, open the file again, read the next row, and so on... but I have no idea how to do this. Can this work? (I have a really long execution time limit, so that's not a problem.)
Any other ideas?
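To show what I mean, here is a rough sketch of the row-at-a-time idea using PHP's fgets(), which reads one line per call so the whole file never has to sit in memory. The offsets are the same ones as in my script below; import_rows() and the $insert callback are just made-up names for the sketch:

```php
<?php
// Sketch: stream the file one line at a time with fgets(), so only the
// current row is ever held in memory. import_rows() and $insert are
// hypothetical names; $insert would run the INSERT query for one row.
function import_rows($path, $insert)
{
    $handle = fopen($path, "r");
    if ($handle === false) {
        return 0; // could not open the file
    }
    $rows = 0;
    while (($line = fgets($handle)) !== false) {
        $line = rtrim($line, "\r\n");   // drop the trailing newline
        if ($line === "") {
            continue;                   // skip blank lines
        }
        $row1 = substr($line, 1, 10);   // same offsets as in my script
        $row2 = substr($line, 11, 6);
        $insert($row1, $row2);          // the length check from my script
        $rows++;                        // could wrap this call
    }
    fclose($handle);
    return $rows;
}
```

The file is opened once and read sequentially, so there would be no need to actually close and reopen it between rows.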
This is the script which I'm using right now:
$db = new DB_Sql();
$db->connect();
// delete old entries
$query = "DELETE FROM mytable";
$db->query($query);
// read the whole file into memory (this is what hits the memory limit)
$PATH = "/somepath/bigfile.dat";
$TheFile = new file_manager($PATH);
$content = $TheFile->read();
// create an array with one element per row
$pieces = explode("\n", $content);
$result = count($pieces);
// loop through the array and insert the two fields from each row
for ($a = 0; $a < $result; $a++) {
    $row1 = substr($pieces[$a], 1, 10);
    $row2 = substr($pieces[$a], 11, 6);
    if (strlen($row1) == 9 && strlen($row2) == 5) {
        $query = "INSERT INTO mytable (somedata1,somedata2) VALUES ('$row1','$row2')";
        $db->query($query);
    }
}
$db->close();
As you can see, this is not really a good way of working with big files...
Edit:
By the way, how do I change the memory limit from Apache? I can't find anything in the httpd.conf file.
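Edit 2: partly answering my own question for anyone who finds this later — the 8388608-byte (8 MB) cap is PHP's memory_limit setting, not an Apache one, which is why it doesn't show up as such in httpd.conf. It can be raised in php.ini (memory_limit = 32M), per directory with a mod_php line like "php_value memory_limit 32M" in httpd.conf or .htaccess, or at runtime as sketched here ("32M" is only an example value):

```php
<?php
// Runtime alternative, at the top of the script (works unless the host
// disallows changing it). "32M" is only an example value.
ini_set("memory_limit", "32M");
```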