I'm using file_get_contents to open a URL, but I basically only need the first 20 lines of code from the URL, and the URL spits out at least 300 lines.

Is there any way to limit file_get_contents by size? That way I won't have to download a 1MB file.

    file_get_contents($url, 100000) would be ~100 KB?
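
    For what it's worth, file_get_contents() does take a maximum-length argument, but it is the fifth parameter (maxlen), not the second; the second parameter is the use_include_path flag. A minimal sketch, assuming a PHP version where maxlen is available (the URL is just a placeholder):

        // Stop reading after roughly 100KB; the fifth argument (maxlen)
        // tells file_get_contents() how many bytes to read at most.
        $url  = "http://example.com/big-file.txt";
        $data = file_get_contents($url, false, null, 0, 100 * 1024);

        if ($data !== false) {
            echo strlen($data) . " bytes read\n";
        }

    Because the stream is closed once maxlen bytes have been read, this should also keep the actual transfer close to 100KB, although buffering means a little more may come over the wire.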

      Well, if you'd rather limit how much data you read from a file into a string by size, why not use something like this:

       
      $file = fopen("file.txt", "r");
      $file_contents = fgets($file, 5000);  // reads at most 5000 - 1 bytes, or up to the first newline
      fclose($file);
      

        Basically, the file at the URL is about 1MB if I download the whole thing.

        I would just like to download the first 100KB of it using file_get_contents to reduce bandwidth and processing time.

          By default, the function file_get_contents() loads the entire contents of a file into a string.

          Use:

           
          $file = fopen("file.txt", "r");
          $file_contents = fgets($file, 100000);  // limits the read to roughly 100KB
          fclose($file);
          
          

          $file_contents will have up to 100KB worth of data from the top of the specified file (file.txt in this case).
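
          One caveat: fgets() also stops at the first newline, so on a multi-line file it returns at most one line, not a full 100KB. If the goal really is "the first 100KB regardless of line breaks", fread() is probably closer to what's wanted; a minimal sketch (the local file name is just a placeholder):

              $file = fopen("file.txt", "r");
              $file_contents = fread($file, 100000);  // reads up to 100,000 bytes, ignoring newlines
              fclose($file);

          For a network stream, fread() can return fewer bytes than requested in a single call, so a loop (or stream_get_contents(), shown further down) may be needed to reliably get the full 100KB.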

            Gotcha, but the entire contents of the 1MB file is still downloaded, right? So the bandwidth used will still be 1MB.

              With file_get_contents, yes; with the code I posted, only about 100KB.
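
              If bandwidth on the remote URL is the real concern, another option is to ask the server for only the first chunk via an HTTP Range header in a stream context. This only helps if the server honors Range requests (not all do; if it ignores them, the full file is still sent). A rough sketch:

                  // Request only bytes 0-102399 (~100KB) of the remote file.
                  $context = stream_context_create(array(
                      'http' => array('header' => "Range: bytes=0-102399\r\n"),
                  ));
                  $data = file_get_contents($url, false, $context);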

                Slick, but it seems that fgets is very slow compared to file_get_contents.
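
                It's hard to say why without seeing the benchmark, but one alternative worth trying is stream_get_contents() with a length limit, which reads the whole block in a single call instead of line by line; a rough sketch:

                    $file = fopen($url, "r");
                    $file_contents = stream_get_contents($file, 100000);  // read up to 100,000 bytes in one call
                    fclose($file);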
