I am building a file sharing application in PHP so that users can upload pretty large PDF files (up to 15MB) to a web server. It all works as it should for files up to 8MB. Anything over 8MB, however, doesn't go up at all. In Firefox, it gives a "The document contains no data." alert a few seconds after the form is submitted. In IE, after a few seconds it goes to the address of the page that is supposed to process the file, but displays the "The page cannot be displayed" page.

Any ideas as to what might be keeping the files over 8MB from being uploaded?

I've added the standard MAX_FILE_SIZE hidden form element on the form submission page:

<input type="hidden" name="MAX_FILE_SIZE" value="16384000">
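For context, the surrounding form looks something like this (the field names match the processing code below and the action matches the script named in the error later on; the submit button's value is a guess). Note that the form needs enctype="multipart/form-data", and MAX_FILE_SIZE has to appear before the file input:

<form action="file_process.php" method="post" enctype="multipart/form-data">
	<input type="hidden" name="MAX_FILE_SIZE" value="16384000">
	<input type="file" name="userfile">
	<input type="submit" name="upload" value="Upload">
</form>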

Here's the file process statement:

if(isset($_POST['upload']) && $_FILES['userfile']['size'] > 0) {
	$file_name = $_FILES['userfile']['name'];
	$tmpName = $_FILES['userfile']['tmp_name'];
	$file_size = $_FILES['userfile']['size'];
	$file_type = $_FILES['userfile']['type'];
	// read the uploaded file from PHP's temporary location into memory
	$fp = fopen($tmpName, 'r');
	$content = fread($fp, filesize($tmpName));
	$content = addslashes($content);
	fclose($fp);
	// escape the file name too, unless magic quotes already did
	if(!get_magic_quotes_gpc()) { $file_name = addslashes($file_name); }

// From here I insert the file information into one DB table and the actual file into another DB table, linked to the first with a key.

... }

I've optimized my php.ini and MySQL settings for larger files...though I'm not sure I haven't overlooked something here. I've never had to mess with these settings in the past. The problem arises before I attempt to insert the file into the DB, so I'm guessing the problem lies on the PHP configuration side rather than the MySQL configuration side, but I'm not sure.

    With file limits, there is a master cap set in php.ini, and the value you place in your code works within that range. Basically, that means if your php.ini says 8MB is the limit, setting the HTML value to anything larger than that (e.g. 16MB) won't work; it would have to be equal to or less than 8MB.
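    For example, the relevant directives in php.ini would look something like this (values are illustrative; post_max_size needs to be at least as large as upload_max_filesize, since the whole POST body counts against it):

    upload_max_filesize = 16M
    post_max_size = 16M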

    Also, you may want to check your server settings and increase the connection timeout so larger uploads don't time out.

    Hiji

      try inserting this at the beginning of your script:

      ini_set("upload_max_filesize", "40M");
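      (one caveat: PHP parses the uploaded POST data before your script runs, so an ini_set() at the top of the script may come too late to affect the current request. if PHP runs as an Apache module, you can set it per-directory in a .htaccess instead, something like:)

      php_value upload_max_filesize 40M
      php_value post_max_size 40M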

        Thanks for the suggestions.

        I did change the memory limit in php.ini from 8MB to 16MB, and that stopped the "This document contains no data" firefox error I was getting. Now, the file seems to upload but chokes, I think, on this part of the script:

        $content = fread($fp, filesize($tmpName)); // line 34
        $content = addslashes($content); // line 35 

        So uploading a file larger than 8MB now produces the following error:

        Fatal error: Allowed memory size of 25165824 bytes exhausted (tried to allocate 18329807 bytes) in /var/www/livesite/fileshare/file_process.php on line 35

        In my version of PHP -- 4.1.2 -- the memory_limit value that sets how much memory the processing of a script can consume can't be higher than 24MB. Mine is set to 24M, thus the "Allowed memory size of 25165824" in the error message above.
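        For reference, the line in my php.ini (24 * 1024 * 1024 = 25165824 bytes, which matches the figure in the error):

        memory_limit = 24M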

        So now I have two questions. The first is: why is the server reaching the allowed memory size of 24M when it is only trying to allocate roughly 18M? Is this a sign of a "memory leak" in the code?

        The second is: why does the server need to allocate 18M when the file being uploaded is, in this case, only 8.25M in size?

        Note also that the error message says it is choking at line 35 of the script: $content = addslashes($content). Am I right in thinking that the more-than-doubling of the allocated memory happens because the data in $content is in effect being held twice (i.e., the second time through the addslashes() command)?

        If I comment that line out, it will upload the large file to the server and doesn't choke...but it can't add the file to the database anymore either using this bit of script:

        // INSERT INTO STORED TABLE
        $query = "INSERT INTO fs_stored (s_thefile, s_thetype) VALUES ('$content', '$file_type')";
        if(mysql_query($query)) // ....

        If this is the case, that it is reading the file twice, then I think I am sunk because the script will consume twice as much memory as I can allocate to it.

        Any ideas?

          my guess is that it is reading the file twice into memory, once into the $_FILES superglobal and once with your fread command.

          out of interest, what are you trying to do with the fread bit? can you not just do this with the $_FILES superglobal since you already have the file in memory?
          if not, consider redirecting to a separate script to cure the memory issue.

          not sure of the exact ins and outs, but as a habit i never store (large) files in a database, but instead store them outside the web tree and use PHP to control access to them.

            liquorvicar wrote:

            my guess is that it is reading the file twice into memory, once into the $_FILES superglobal and once with your fread command.

            out of interest, what are you trying to do with the fread bit? can you not just do this with the $_FILES superglobal since you already have the file in memory?
            if not, consider redirecting to a separate script to cure the memory issue.

            Good question! I'm not experienced enough with this kind of operation, so I've had to borrow code for things like saving a file to a server and then reading it into a database. This fread routine was recommended to me and it worked until I got to testing with larger file sizes.

            The long and short of it is that I'm not sure I'd know how to get the uploaded file out of the $_FILES superglobal in another way in order to read it into the db. Am I obliged to use the fopen() and fclose() bit?

            $fp = fopen($tmpName, 'r');
            $content = fread($fp, filesize($tmpName));
            $content = addslashes($content);
            fclose($fp);
            

            Or is there another way of pulling the content out of $_FILES?

            not sure of the exact ins and outs, but as a habit i never store (large) files in a database, but instead store them outside the web tree and use PHP to control access to them.

            This is a long story, the upshot of which is that we are obliged to store the files in the DB rather than on the server.

              re-reading the docs, i don't think there is a way of getting the raw data from the $_FILES superglobal. you're better off uploading the file in one script, and if that works OK and it all verifies, then redirect using:

              header('Location: xxx.php?file='.$uploaded_file_name);

              to another script to read the file into memory and stick it into the database.

              that way each script will only consume the memory equivalent to the file's size.

              alternatively, you may be able to unset() the $_POST data and do it all in the same script.
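              the two-script version would look roughly like this (the paths and the insert.php name are just examples; note that PHP deletes the uploaded tmp file when the first script ends, so you'd have to move_uploaded_file() it somewhere first):

              // upload.php -- save the upload, then hand off to the insert script
              $saved = '/var/www/livesite/fileshare/tmp/' . basename($_FILES['userfile']['name']);
              if(move_uploaded_file($_FILES['userfile']['tmp_name'], $saved)) {
                  header('Location: insert.php?file=' . urlencode(basename($saved)));
                  exit;
              }

              // insert.php -- read the saved file and do the DB insert
              $name = basename($_GET['file']); // basename() blocks "../" tricks
              $path = '/var/www/livesite/fileshare/tmp/' . $name;
              $fp = fopen($path, 'rb');
              $content = addslashes(fread($fp, filesize($path)));
              fclose($fp);
              // ... INSERT INTO fs_stored as before, then unlink($path)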

                The file itself doesn't appear in the $_FILES array; only filenames are stored there, and the files themselves sit in the filesystem. (Generally speaking, it's better to store files in the filesystem and store something to locate them with in an array or database. Benefits include much faster access, much lower memory consumption, and less hassle with encoding arbitrary binary data for storage in different formats.)
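                As a sketch of that approach (the s_thepath column and the /var/files directory are invented for illustration):

                // store the file outside the web tree; keep only its path in the DB
                $dest = '/var/files/' . uniqid() . '_' . basename($_FILES['userfile']['name']);
                if(move_uploaded_file($_FILES['userfile']['tmp_name'], $dest)) {
                    $query = "INSERT INTO fs_stored (s_thepath, s_thetype) VALUES ('"
                           . addslashes($dest) . "', '" . addslashes($file_type) . "')";
                    mysql_query($query);
                }
                // to serve it later, check permissions in PHP, then something like:
                // header('Content-Type: ' . $row['s_thetype']); readfile($row['s_thepath']);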

                Like the error message says, the blowout occurs on this line:

                $content = addslashes($content); // line 35 

                For a moment there are two copies of the file - one with slashes added, one without. For an 18MB file, that's 36MB right there, even without considering the additional space needed for all those slashes.
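                If you do have to keep it all in one request, you can at least avoid holding a full unescaped copy alongside a full escaped one by escaping the file in chunks while building the query. A rough sketch (the 1MB chunk size is arbitrary; addslashes() works byte by byte, so chunk boundaries are safe):

                $fp = fopen($tmpName, 'rb');
                $query = "INSERT INTO fs_stored (s_thefile, s_thetype) VALUES ('";
                while(!feof($fp)) {
                    $query .= addslashes(fread($fp, 1048576)); // escape 1MB at a time
                }
                fclose($fp);
                $query .= "', '" . addslashes($file_type) . "')";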

                  what about an Apache timeout?
