Hi All,

I'm using file_get_contents() to extract arrays from a text file and populate a database with them. It works fine on quite large files, but when they get very large (around 145 MB) it simply doesn't work. Does anyone know if there's a limit on .txt file sizes, or on the size of the array built from the lines (there are about 100,000 of them!)?

Here's the code:

$exportfile = file_get_contents("sql_output.txt");

$lines = explode("\n",$exportfile);

// This removes any empty entries from the lines array:
foreach($lines as $key => $value)
{
  if($value == "") {
    unset($lines[$key]);
  }
}
$lines = array_values($lines);

Any thoughts anyone?

Many Thanks

Silas

    I am not sure whether php.ini has any restrictions regarding this (perhaps one of the pros could expound), but how big a file you can read into memory is highly system-dependent. Basically, you cannot read anything bigger than your RAM (note that swap is not included here): your OS, all the "ever-present" processes, plus the PHP parser all have to be in memory (for the most part), and you only get whatever space is left after that.

    That being said, I don't think you should have problems reading in a 150 MB file if you have even 1 GB of RAM, provided you don't have a whole bunch of other stuff going on. But you are also storing it in an internal array - i.e. using more than twice the space - so you have to keep that in mind.
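
    A quick way to see the numbers involved (just a sketch - ini_get() will at least tell you whether php.ini puts a memory_limit on the script):

    echo ini_get('memory_limit') . "\n";    // per-script memory cap, if one is set
    echo filesize('sql_output.txt') . "\n"; // how many bytes the raw file takes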

    My $0.02.....

      Hmmm... Thanks for the thoughts. In fact, it's only running on a server of my own at the moment, so I should be a little careful with file sizes once it gets uploaded to the ISP's servers... Maybe better to restrict the file size anyway.

      Thanks again!

        silasslack wrote:
        $exportfile = file_get_contents("sql_output.txt");
        
        $lines = explode("\n",$exportfile);

        At this point the entire file will be in memory twice - once as a single string, and once as an array of strings. [man]file[/man] would cut out one of those steps, but if memory consumption really is a concern there's

        $fp = fopen('sql_output.txt', 'rb');
        while(!feof($fp))
        {
            $line = trim(fgets($fp));
            if($line=='') continue;
            // Do whatever it is you do with the line.
        }
        fclose($fp);
        

        Then you only ever have one line of the file in memory at a time.
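
        Going back to the [man]file[/man] route, that version would be roughly (a sketch - the two flags need PHP 5, and they also handle the trimming and the empty-line cleanup for you):

        // Read straight into an array of lines, skipping the intermediate string
        $lines = file('sql_output.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);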

        Another possibility, depending on the format of the text file (SQL?), is to have the database itself read the file directly and not get PHP involved at all.
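
        (If it happened to be MySQL and the file were plain delimited data rather than SQL statements, LOAD DATA LOCAL INFILE could do exactly that - just a sketch, with made-up table and connection details, and assuming the server allows LOCAL INFILE:)

        // Assumes MySQL, a tab-delimited file, and placeholder credentials/table name
        $db = mysql_connect('localhost', 'user', 'pass');
        mysql_select_db('mydb', $db);
        mysql_query("LOAD DATA LOCAL INFILE 'sql_output.txt'
                     INTO TABLE my_table
                     FIELDS TERMINATED BY '\\t'
                     LINES TERMINATED BY '\\n'", $db);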

          You could also try disabling the max_execution_time limit.
          By default it is 30 seconds; you can see the current value with phpinfo().

          Putting this at the beginning of your script sets it to unlimited time, so the script won't time out while loading very large files:

          set_time_limit( 0 ); // set max execution time to unlimited

          Or you can set it to, for example, 300 seconds = 5 minutes.
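
          If you'd rather not hunt through the whole phpinfo() output, ini_get() shows the current value too (just a sketch):

          echo ini_get('max_execution_time'); // current limit in seconds
          set_time_limit( 300 );              // allow up to five minutes instead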

            Like a fool I had the PHP memory limit set to the default of 8 MB! I've changed it to 800 MB and the problem is instantly solved!
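
            (For anyone else who hits this: the setting is memory_limit, changed either in php.ini or at runtime - assuming your host lets you override it:)

            ini_set('memory_limit', '800M'); // or: memory_limit = 800M in php.ini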

            I feel like a proper duffer now!

            Thanks for your help everyone

            Silas
