Greetings, I am trying to zip up a number of images that are being pulled down from Amazon S3 into a folder, but it always seems to time out somewhere between 20 and 44 pictures.

There should be 47 images in total, though in the future there may be even more. I am pretty sure I am running out of RAM; the VPS has 256MB of RAM.

I have tried to refactor the code to be more and more efficient, but I am now running out of ideas.

The script seems to return a server error even when it is only downloading the images, so the problem may be on that side of things.

    $amazon_params = AmazonS3::first();
    $amazonS3 = new S3();
    $amazonS3::setAuth($amazon_params->accesskey, $amazon_params->secretkey);

    $gallery = Gallery::find(array('conditions' => array("slug = ?", $this->_request->getParam('slug'))));
    $photos  = Image::all(array('conditions' => array("gallery_id = ? ORDER BY title LIMIT 0,10", $gallery->id)));

    // Only build the archive if it isn't already sitting on S3
    if ($amazonS3::getObjectInfo($amazon_params->bucket, $amazon_params->image_s3 . '/zip/' . $gallery->slug . '.zip', true) === false)
    {
        $path = dirname(dirname(dirname(dirname(dirname(__FILE__))))) . "/tmp/zip/";
        if (!is_dir($path . $gallery->slug)) {
            mkdir($path . $gallery->slug, 0777);
        }

        // New zip archive (CREATE opens the file, creating it if it doesn't exist yet)
        $zip = new ZipArchive();
        $zip->open($path . $gallery->slug . '.zip', ZipArchive::CREATE);

        $i = 0;
        foreach ($photos as $photo) {
            $i++;
            // Download each image from S3 into the temp folder, then queue it for the archive
            $file = $path . $gallery->slug . "/" . $i . ".jpg";
            $amazonS3::getObject($amazon_params->bucket, $amazon_params->image_s3 . $photo->file, $file);
            $zip->addFile($file, basename($file));
        }

        $zip->close();
    }
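
One variation I have been thinking of trying (just a rough sketch, assuming set_time_limit() is allowed on this VPS) is resetting the execution timer for each download and deleting the temp files once the zip has been written:

    // Rough sketch only -- reuses $amazonS3, $amazon_params, $photos, $path and $gallery from above
    $zip = new ZipArchive();
    $zip->open($path . $gallery->slug . '.zip', ZipArchive::CREATE);

    $files = array();
    $i = 0;
    foreach ($photos as $photo) {
        set_time_limit(30); // restart the timer so each download gets its own 30-second window
        $i++;
        $file = $path . $gallery->slug . "/" . $i . ".jpg";
        $amazonS3::getObject($amazon_params->bucket, $amazon_params->image_s3 . $photo->file, $file);
        $zip->addFile($file, basename($file));
        $files[] = $file;
    }

    // addFile() only reads the files when the archive is closed, so clean up afterwards
    $zip->close();
    foreach ($files as $file) {
        unlink($file);
    }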

    What's the error message in your webserver's error logs? A 500 or "internal server error" is going to leave behind some trace in the error logs... unless you don't have any, of course.

    Also, if PHP's timing out (max execution time, etc.), same goes for the PHP error log. So... do you have log_errors set to On and error_reporting set to E_ALL? If so, have you checked the PHP error log for relevant errors?
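
    If you're not sure, you can force those settings at the top of the script while you're debugging. A quick sketch -- adjust the log path to wherever you want PHP to write its errors:

        // Debugging only: report everything and make sure it ends up in the log
        error_reporting(E_ALL);
        ini_set('display_errors', '0');                   // keep errors out of the page output
        ini_set('log_errors', '1');                       // ...and write them to the error log instead
        ini_set('error_log', '/var/log/php-scripts.log'); // example path -- use your own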

      So far my log is reporting:

      [02-Aug-2011 23:39:48] PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib64/php/modules/php_gd.so' - /usr/lib64/php/modules/php_gd.so: cannot open shared object file: No such file or directory in Unknown on line 0

      I have now installed this .so file so that error is silenced, but I am still getting 500 errors, with my Apache log saying "Premature end of script headers in index.php".

        What are the values of the PHP directives error_reporting, log_errors, and error_log as reported by phpinfo()?
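
        If wading through the full phpinfo() page is a pain, a quick loop over ini_get() will print just those three directives (rough sketch):

            foreach (array('error_reporting', 'log_errors', 'error_log') as $directive) {
                echo $directive, ' = ', var_export(ini_get($directive), true), "<br />";
            }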

          PHP info for errors (local value / master value):

          log_errors          On / On
          log_errors_max_len  1024
          max_input_time      60
          memory_limit        128M
          max_execution_time  0
          error_reporting     22519 / no value
          error_log           /var/log/php-scripts.log / /var/log/php-scripts.log

            I tried with 10 images and checked how much memory it was using per image while just downloading them.

            Using 8388608 bytes of ram. 
            Using 8388608 bytes of ram. 
            Using 8388608 bytes of ram. 
            Using 8388608 bytes of ram. 
            Using 8388608 bytes of ram. 
            Using 8388608 bytes of ram. 
            Using 8388608 bytes of ram. 
            Using 8388608 bytes of ram. 
            Using 8388608 bytes of ram. 
            Using 8388608 bytes of ram. 
            Using 8388608 bytes of ram.
            

            using this code:

                $i = 0;
                foreach ($photos as $photo) {
                    $i++;
                    $file = $path . "/" . $gallery->slug . "/" . $i . ".jpg";
                    $fp = fopen($file, "wb");
                    $object = $amazonS3::getObject($amazon_params->bucket, $amazon_params->image_s3 . $photo->file, $fp);
                    echo "Using ", memory_get_peak_usage(1), " bytes of ram. <br />";
                }
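
            Next I'm going to try adding each file into the zip inside the same loop and printing the peak memory as it goes, to see whether it's memory or time that kills it -- untested sketch, same variables as above:

                // Untested sketch -- same $photos, $path, $gallery and $amazonS3 as above
                $zip = new ZipArchive();
                $zip->open($path . $gallery->slug . '.zip', ZipArchive::CREATE);

                $i = 0;
                foreach ($photos as $photo) {
                    $i++;
                    $file = $path . "/" . $gallery->slug . "/" . $i . ".jpg";
                    $amazonS3::getObject($amazon_params->bucket, $amazon_params->image_s3 . $photo->file, $file);
                    $zip->addFile($file, basename($file));
                    echo "Image $i: ", memory_get_peak_usage(true), " bytes peak. <br />";
                }

                $zip->close();
                echo "After close: ", memory_get_peak_usage(true), " bytes peak. <br />";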