Hi, I'm using a simple image resize script from phptoys.com to create a batch of thumbnails... I keep getting the error:

Fatal error: Maximum execution time of 30 seconds exceeded in /blahblahblah/resizeimage.php on line 43

Here's the function I'm calling that gives the error:

function resizeImage($originalImage, $toWidth, $toHeight) {

    // Get the original geometry and calculate scales
    list($width, $height) = getimagesize($originalImage);
    $xscale = $width / $toWidth;
    $yscale = $height / $toHeight;

    // Recalculate new size, preserving the aspect ratio
    if ($yscale > $xscale) {
        $new_width  = round($width * (1 / $yscale));
        $new_height = round($height * (1 / $yscale));
    } else {
        $new_width  = round($width * (1 / $xscale));
        $new_height = round($height * (1 / $xscale));
    }

    // Resize the original image
    $imageResized = imagecreatetruecolor($new_width, $new_height);
    $imageTmp     = imagecreatefromjpeg($originalImage);
    imagecopyresampled($imageResized, $imageTmp, 0, 0, 0, 0, $new_width, $new_height, $width, $height);

    return $imageResized;
}

here's my code that calls it:

echo 'looping through photos<br />';
while (($file = $images->read()) !== false) {
    // name image file
    $imgpath = $albumf . '/slides/' . $file;
    // name thumbnail file
    $thumbpath = $albumf . '/thumbs/' . $file;
    // avoid .htaccess, subfolders etc.
    if (($file[0] !== '.') && is_file($imgpath)) {
        list($width, $height) = getimagesize($imgpath);
        // calculate thumb width from the fixed height
        $thumbw = round($width / $height * $thumbh);

        echo 'creating thumbnail for ' . $file;
        // Resize the image
        $newthumb = resizeImage($imgpath, $thumbw, $thumbh);
        // Create the thumbnail file
        imagejpeg($newthumb, $thumbpath, 100);
        // leave permissions open on the thumbnail
        chmod($thumbpath, 0777);
        echo ' and the xml code<br />';
        // a whole load of other stuff irrelevant to this problem!
    }
}

this is part of a loop that goes through a folder of jpegs, creating thumbnails in a separate folder and doing some other stuff. Any ideas what's causing this timeout? It only happens on the live server - my server at home handles it without a blip.
It's always dying on a different image, so it's not that there's a dodgy image in there.

    I'd like to know where the $images variable comes from. Do you think there could be an image with a huge file size, more than 1-2 MB? Also, please put your code in [PHP] tags.

      The test folder of images doesn't have anything bigger than 130k, which shouldn't be an issue, shirley? There might be bigger images sometimes though - it's a script to automatically generate the stuff needed for flash slideshows of people's wedding photos. Anything I can do to prepare for this eventuality?

      as for where $images comes from, it's here:

      if (!file_exists($albumf.'/thumbs')) {
          mkdir($albumf.'/thumbs');
          chmod($albumf.'/thumbs', 0777);
      }
      echo 'reading photos folder<br />';
      $images = dir($albumf.'/slides');

      (used the proper tags this time - sorry about previously)
      the chmods are there because everything was generated with awkward permissions which upset the flash app.

        I didn't find anything suspicious. Are you getting the thumbnails for at least some of the images?

          I'd guess either timeout issues or memory limit issues. And yes, 130k can definitely be a problem. If you for example create a 1500x1500 image containing only one colour and save it as a jpeg, you will get a file size < 200k. But if you save it as a 24-bit bmp the file size will be close to 7MB, and that is roughly the amount of data gd has to deal with in memory.
          If you have a memory limit set lower than, say, 32MB, you should probably up it. And if you don't, then you need to check that you really do release resources properly or you will use up memory in no time.

          If memory doesn't seem to be a problem, then check how long it takes to create 10 thumbnails to get an estimate on average time, then check how many images you have to process per slides directory...
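          A sketch of how you might get that estimate with microtime() - here createThumbnail() is a hypothetical stand-in for the real resize-and-save code, and usleep() is placeholder work so the sketch runs on its own:

```php
<?php
// Rough sketch: time a fixed number of conversions to estimate the
// average cost per thumbnail.
$sample = 10;
$t0 = microtime(true);
for ($i = 0; $i < $sample; $i++) {
    // createThumbnail($files[$i]); // hypothetical per-image work
    usleep(1000);                   // placeholder so the sketch runs
}
$perImage = (microtime(true) - $t0) / $sample;
echo 'average: ' . round($perImage, 4) . "s per thumbnail\n";
```

          Multiply the average by the number of files in the slides directory to see whether 30 seconds was ever realistic.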

            Looking into it a bit more, some of the other images it's got to use can be around 250k. These are all normal photographs, about 800x550 pixels.

            Also, I really should have mentioned this before, the script has upwards of 500 images to deal with at a time. Is this the cause of the timeouts? I was assuming that it was on a per image basis, but maybe it's expecting to do the lot in 30 seconds...

            I don't really know how to release resources in this script. I'm quite new to file handling in general... Is there something I should do after each image is generated and saved to clear the way for the next one? I tried unsetting the $thumbpath variable (and a few others) but that didn't seem to make a difference.

            As to changing memory limits, I'm not sure the host will let me do that, but I'll see...

            The error is now coming up on my local computer too. I think it's because I changed the folder of images after accidentally deleting the previous one... At least I can now test my fixes without having to upload every time!

              mistafeesh;10963516 wrote:

              Looking into it a bit more, some of the other images its got to use can be around 250k. These are all normal photographs of about 800x550 pixel size.

              IIRC, gd needs (pixels × 3) + a small overhead in bytes for an image. So in this case roughly 800 × 550 × 3, or about 1.26MB.

              mistafeesh;10963516 wrote:

              Also, I really should have mentioned this before, the script has upwards of 500 images to deal with at a time. Is this the cause of the timeouts? I was assuming that it was on a per image basis, but maybe it's expecting to do the lot in 30 seconds...

              Timeout is the total execution time for the request. This is most likely where things go wrong, so up the time limit. You don't have to change the php.ini setting if this is the only script requiring more time: http://se.php.net/manual/en/function.set-time-limit.php
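              For example (a sketch - 300 seconds is an arbitrary figure, pick whatever suits your batch size):

```php
<?php
// Sketch: raise the limit for this script only, without touching php.ini.
// Note that each call to set_time_limit() restarts the timer from zero,
// so calling it once per loop iteration would instead give every image
// its own fresh budget.
set_time_limit(300); // e.g. five minutes for the whole batch
echo ini_get('max_execution_time'), "\n"; // now reports 300
```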

              mistafeesh;10963516 wrote:

              I don't really know how to release resources in this script. I'm quite new to file handling in general... Is there something I should do after each image is generated and saved to clear the way for the next one? I tried unsetting the $thumbpath variable (and a few others) but that didn't seem to make a difference.

              I'd guess that as long as you do something like this

              while (nextFile()) {
                  $im = imagecreatetruecolor();
                  ...
              }

              PHP will take care of releasing the old image resource when $im loses its reference to it. But if you for example append new image resources to an array, they will continue to take up memory until the script ends.
              To manually handle resource cleanup for gd images, use imagedestroy().
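              A minimal sketch, assuming the gd extension is loaded - free each image explicitly once its thumbnail has been written, instead of relying on the variable being overwritten next iteration:

```php
<?php
// Sketch: allocate a gd image, use it, then release it immediately.
if (extension_loaded('gd')) {
    $im = imagecreatetruecolor(100, 100); // pixel buffer held inside gd
    // ... imagecopyresampled() / imagejpeg() would go here ...
    imagedestroy($im); // release the buffer now, not at script end
}
```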

              mistafeesh;10963516 wrote:

              As to changing memory limits, I'm not sure the host will let me do that, but I'll see...

              It probably won't be needed from what you've said so far.

              Also, no matter how high you set your time limit, there is of course always a risk that a folder contains too many images to be handled in that time. As such it might be a good idea to keep track of how long the script has been running (time() would suffice here, else microtime()) and stop the conversion process when you have less than 5 seconds left of the max execution time.
              If you abort conversion prematurely, inform the user (or whoever is initiating the task) and ask them to click continue. You'd also need to keep track of which images have thumbnails and which don't, but I'm assuming you already have some means to do that.
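              A sketch of that idea, keeping a 5-second safety margin - createThumbnail() and $files are hypothetical stand-ins for your own loop:

```php
<?php
// Sketch: stop converting when fewer than $margin seconds remain of
// the allowed execution time, so the script can exit cleanly.
function withinBudget($start, $limit, $margin = 5)
{
    return (time() - $start) < ($limit - $margin);
}

$start = time();
$limit = (int) ini_get('max_execution_time');
if ($limit <= 0) {
    $limit = 30; // CLI reports 0 (no limit); assume the web default
}

// foreach ($files as $file) {              // $files: hypothetical list
//     if (!withinBudget($start, $limit)) {
//         echo 'Ran out of time - click continue to resume<br />';
//         break;
//     }
//     createThumbnail($file);              // hypothetical per-image work
// }
```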

                thanks for that johanafm - that's a lot of really useful stuff for me to get my teeth into.
                I'll go through and make changes to my code based on what you've said here, then I'll let you know how it's going...

                  Btw, two other ways to deal with this are to either create a thumbnail directly on image upload, or to create thumbnails when they are needed.
                  Creating thumbs on upload means you don't risk overly long execution times.
                  Creating thumbs on image requests might lead you into the same problem if you show too many thumbs per page and none have previously been generated. Using pagination and only showing 10-50 thumbs at a time ought to do the trick though.

                  And should you decide to go either way, you could still run your old thumb creation script from the command line, which has no time limit. But that assumes you can connect to the server via ssh and run php through CLI.

                    Thanks again.

                    How do you mean creating them on upload? I'm uploading them then running this script. Am I missing something?!
                    I'm building this for a friend who wouldn't be able to use the CLI without a lot of help, and I'm kind of hoping this is the last bit of help he'll need... Don't do work for your friends, kids!
                    I can't have them created on request as it's a closed source flash gallery that's using them, and I think it'd get upset if they're not all there...

                    Is there a way to get the server to time each individual image operation rather than upping the overall time limit and timing the whole lot? I'd like the script to be able to work no matter how many images it gets thrown its way...

                      mistafeesh;10963531 wrote:

                      How do you mean creating them on upload? I'm uploading them then running this script. Am I missing something?!

                      Ah, I was assuming the user could only upload a small amount of images at a time, but then I guess you mean they can actually upload 500+ images in one go.

                      You could of course use set_time_limit(0) to allow infinite script execution time... but the browser will stay in "loading page" state until everything is done, so it's not really a good option.

                      mistafeesh;10963531 wrote:

                      I can't have them created on request as it's a closed source flash gallery that's using them, and I think it'd get upset if they're not all there...

                      Yes, this might be impossible for you, but it really depends on how the flash gallery is built.

                      mistafeesh;10963531 wrote:

                      Is there a way to get the server to time each individual image operation rather than upping the overall time limit and timing the whole lot? I'd like the script to be able to work no matter how many images it gets thrown its way...

                      Perhaps you should have a look at the process control functions, and in particular this comment. This way, you could let a child process go on "forever" creating thumbs, while exiting the parent process after it has displayed the page to the user.
                      However, this also means that the user might try to view the gallery before all thumbs have been converted.
                      But if the flash gallery looks for thumbs, and only deals with full images when a thumb is clicked, there will be no problem. If the gallery on the other hand looks for images and then assumes that the corresponding thumbs are there, you will have problems.

                      To deal with such a problem, if it exists, comparing the counts from glob() or scandir() for your image directory and its corresponding thumb directory is one way to check whether the child process has finished. But if even one image fails conversion, this comparison would also fail. Having the forked process somehow indicate conversion completion is probably better: creating a file in the thumbs or image directory called .thumb_conversion_done is one way, updating a db is another (and if you ever start a new conversion process for the same image directory again, this file/db value has to be removed/changed). If you are uploading files through a php script, you'd do this as soon as new files are uploaded.
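                      The marker-file approach might look like this (a sketch - the function names are made up):

```php
<?php
// Sketch: the forked worker drops a marker file once every thumbnail
// has been written; the page-serving code checks for it later.
function markConversionDone($thumbDir)
{
    touch($thumbDir . '/.thumb_conversion_done');
}

function conversionDone($thumbDir)
{
    return file_exists($thumbDir . '/.thumb_conversion_done');
}

// Remember to unlink() the marker before starting a fresh conversion
// run for the same directory.
```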

                        Right then... I've put an imagedestroy after each image file is written and increased the time limit to a few minutes. It's all working fine, although I will change it to using pcntl_fork as soon as I can get my head around it! I don't like the idea of it breaking due to a browser interruption.
                        I've learned quite a bit through this one!

                        Thanks so much for your help johanafm - if you're ever in Cornwall(UK) I'll buy you a pint!

                        Dan
