I have ten ~200 MB .txt files in a folder. I'm pretty sure each is tab-delimited (not CSV). The plan is to loop through each file, then through each line of each file, and load the records into a MySQL database. In the past I've used fread($file, filesize($filename)), but that won't work here because the files are too large to read into memory all at once, so I'm trying to read each file in 1,024-byte chunks (fread()'s second argument is a byte count, not KB).

My worry is that a chunk could end in the middle of a record and I'd lose data at the boundary. Right now I can't seem to make any use of $data, although when I run the following code it does echo the filename, 22, and "Finished", so it is doing something.
Thanks for any help.
<?php
// Open the data folder
$directory = opendir('./data') or die("Unable to open path");

// Loop through the files (loop commented out while testing a single file)
//while (false !== ($filename = readdir($directory))) {
    $filename = readdir($directory);
    if ($filename == "." || $filename == "..") continue;
    echo $filename . '<br />';

    // Loop through the file in 1,024-byte chunks
    $i = 0;
    $file = fopen('http://pathtofile' . $filename, "r");
    while (!feof($file)) {
        $data = fread($file, 1024);
        $data_array = explode("\n", $data);
        foreach ($data_array as $row) {
            $record = explode("\t", $row);
            echo $record[0] . '<br />';
        }
        $i++;
    } // end chunk loop
    echo $i;
    fclose($file);
    echo $i . '<br />';
    //$files++;
    //import into database
//} // end file loop
echo 'Finished';
?>
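
For what it's worth, here's the direction I was thinking of going next: read line by line with fgets() instead of fixed-size chunks, since fgets() reads up to the next newline however long the line is, which should sidestep the split-record worry entirely. This is only a rough sketch, not working code: the DSN, credentials, and the "records" table with its three columns are all made up, and I'm assuming the files live under ./data (the same path opendir() uses above) rather than behind the http:// URL.

<?php
// Sketch only: connection details and table/column names are placeholders.
$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO records (col1, col2, col3) VALUES (?, ?, ?)');

$directory = opendir('./data') or die("Unable to open path");
while (false !== ($filename = readdir($directory))) {
    if ($filename == "." || $filename == "..") continue;

    $file  = fopen('./data/' . $filename, 'r');
    $lines = 0;
    // fgets() returns one full line per call, so no record is ever split
    // across reads the way it can be with fixed-size fread() chunks.
    while (($row = fgets($file)) !== false) {
        $record = explode("\t", rtrim($row, "\r\n"));
        $stmt->execute(array($record[0], $record[1], $record[2]));
        $lines++;
    }
    fclose($file);
    echo $filename . ': ' . $lines . ' lines<br />';
}
closedir($directory);
echo 'Finished';
?>

Is running a prepared statement per line reasonable for files this size, or is there a faster way to get them into MySQL?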
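
And if chunked fread() really is the better approach, am I right that the fix for the boundary problem is to hold back everything after the last newline in each chunk and prepend it to the next one? Something like this (again just a sketch; the filename is made up and the database insert is omitted):

<?php
// Sketch: carry the partial last line of each fread() chunk over to the
// next chunk so records that straddle a chunk boundary aren't lost.
$file   = fopen('./data/file1.txt', 'r') or die("Unable to open file");
$buffer = '';
while (!feof($file)) {
    $buffer .= fread($file, 1024);
    $lines   = explode("\n", $buffer);
    // The last element may be an incomplete line; keep it for the next pass.
    $buffer  = array_pop($lines);
    foreach ($lines as $row) {
        $record = explode("\t", $row);
        // ...insert $record into the database here...
    }
}
// Whatever is left over is the final line (if the file doesn't end in "\n").
if ($buffer !== '') {
    $record = explode("\t", $buffer);
}
fclose($file);
?>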