I uploaded a 500 MB tar.gz file to my GoDaddy hosting space, and I need to use PHP to extract it (support said that using scripts is the only way files can be extracted on the server).

I am using PEAR::File_Archive to extract the gz file ( http://pear.php.net/package/File_Archive )

Core code:

<?php
error_reporting(E_ALL);
set_time_limit(0);                  // no PHP execution time limit
ini_set('memory_limit', '508M');
ignore_user_abort(true);            // keep running if the browser disconnects
require_once "File/Archive.php";

// $src and $dest are set elsewhere in the real script; example values:
$src  = 'backup.tar.gz';            // uploaded archive
$dest = 'extracted/';               // target directory
File_Archive::extract($src, $dest);
?>

My problem is: the extraction process always aborts without any error after extracting 6,000-12,000 files!

I tested PEAR::Archive_Tar and got the same result: it aborts with no error!

Thanks for any advice!

    Have a look at your host's configuration for max execution time and the like; that is usually what causes silent failures like this.

      To add to Roger Ramjet's response, insert this line at the top of your PHP script:

      ini_set('max_execution_time', 0);

        I have changed the configuration in php.ini:
        max_input_time = 3880
        max_execution_time = 3800

        Still no error output

          Change max_execution_time to 0; that means unlimited.

            Hi, I already added set_time_limit(0) to my PHP code, as in the script in the original post!

              In fact, the extract script always runs for 50-80 seconds (extracting 6,000-12,000 files), then aborts.
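              Since the script reliably dies after about a minute, one workaround is to make the extraction resumable: skip entries that already exist on disk and re-run the script until everything is out. A minimal sketch, assuming PHP 5.3+ with the built-in Phar extension (instead of PEAR); `extract_resumable()`, the paths, and the batch size of 500 are illustrative names, not part of the original script:

```php
<?php
// Sketch of a resumable extraction loop. Entries already present in $dest
// are skipped, so after a timeout the script can simply be re-run and it
// continues roughly where the last run stopped.
function extract_resumable($archive, $dest, $batch = 500)
{
    $tar  = new PharData($archive);               // handles .tar.gz transparently
    $base = 'phar://' . realpath($archive) . '/'; // prefix of every entry path
    $todo = array();
    foreach (new RecursiveIteratorIterator($tar) as $entry) {
        // Path of the entry relative to the archive root.
        $rel = substr($entry->getPathname(), strlen($base));
        if (!file_exists($dest . '/' . $rel)) {
            $todo[] = $rel;
        }
        if (count($todo) >= $batch) {             // extract in small batches
            $tar->extractTo($dest, $todo, true);
            $todo = array();
        }
    }
    if ($todo) {                                  // flush the final batch
        $tar->extractTo($dest, $todo, true);
    }
}
```

              Re-running the script after each abort then costs only the time to scan the archive index plus whatever it manages to extract before the host kills it.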

                Is that 500MB before or after decompression? If before, what is the size after - and how much space have you actually got?

                  The tar.gz file is 500 MB; the decompressed files are maybe 1.5 GB. My GoDaddy hosting space is 100 GB.

                    I know I can split the archive into smaller tar.gz files to make it work, but 500 MB is already the result of splitting. 🙁

                    All my files together are almost 50 GB; I split them into 20 tar.gz files, uploaded them, and want to extract them one by one.

                      Two days have gone by and I cannot find a solution.

                      Now I have split them into 256 gz files, each almost 55 MB.

                      I uploaded 24 files to test, and it seems they can be extracted!
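                      If the chunks are plain gzip files, each one can be decompressed in a streaming fashion with PHP's zlib functions, so memory use stays constant and the 508M memory_limit above is unnecessary. A minimal sketch, assuming the zlib extension is enabled; `gunzip_stream()` and the 1 MB chunk size are illustrative:

```php
<?php
// Streaming gunzip: decompress one .gz file without loading it into memory.
function gunzip_stream($src, $dst)
{
    $in  = gzopen($src, 'rb');
    $out = fopen($dst, 'wb');
    while (!gzeof($in)) {
        fwrite($out, gzread($in, 1048576)); // decompress 1 MB at a time
    }
    gzclose($in);
    fclose($out);
}
```

                      Looping this over the 256 files (and skipping any whose output already exists) keeps each request short enough to survive the host's limits.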

                        Use your shell account. If the provider does not provide one, dump them and find one who does.

                        Mark

                          Contact GoDaddy? Perhaps they enforce a maximum execution time for PHP scripts?

                            Break up the file into tiny 100 MB chunks.

                              Thanks for all the advice!

                              I contacted GoDaddy support; they do not provide shell access and said that PHP is the only way to extract files.

                                In which case, tell them to close your GoDaddy account immediately and find a provider that doesn't suck.

                                Web space is unusable without a shell account, for precisely the reason you've found.

                                Mark
