Hi

I'm trying to get some advice on writing a script that would FTP into multiple remote servers and download from each of them simultaneously.

Ideally I would specify an array of IP Addresses to be connected to and the script would connect to all at the same time.

Once connected, download all files and then (optionally) remove the files from the remote servers.

It's the simultaneous connections / downloads that are stumping me !

This is to be used on a WAN to back up specific files.
Thanks

    Is there any reason why you can't have an instance of a script run for each ftp connection you need to make?

    Also, how are you running this? As a script run from Apache, or over the CLI?

      Why does it need to be simultaneous? Why can't you just handle one connection at a time, like so:

      // Set of FTP servers to connect to
      $servers = array(
         array( 'host' => '12.34.56.78', 'user' => 'someuser', 'pass' => 'passwordhere' ),
         array( 'host' => '123.234.111.1', 'user' => 'someuser', 'pass' => 'passwordhere' ),
         array( 'host' => '87.26.1.2', 'user' => 'someuser', 'pass' => 'passwordhere' ),
      );
      
      foreach( $servers as $server ) {
         $con = ftp_connect($server['host']);
         if ($con === false) {
            continue; // host unreachable, move on to the next one
         }
         if (!ftp_login($con, $server['user'], $server['pass'])) {
            ftp_close($con);
            continue;
         }
         // ftp_get() takes the local path first, then the remote path
         ftp_get($con, '/path/to/local.file', '/path/to/remote.file', FTP_BINARY);
         ftp_close($con);
      }

        Hi
        Thanks for the very quick replies.

        The plan is to run this overnight from a crontab.

        Each remote site has very limited bandwidth.
        If I start at site 1, then site 2, etc., I will run out of time before I get to site 20.

        If I can start 1 to 20 simultaneously then we should be OK. Bandwidth on the main site is fine, it's just the remote sites that are slow !

          I'd just create one script but run several instances of it from your cron.

            That's an option I hadn't thought of !!

            However, I would need to pass IP Addresses to the script.
            And there could be in excess of 50 running !!

              If I created one script, ftp.php, can I pass IP addresses to it ?

              eg:
              ftp.php?site=192.168.0.1

              Could I then create another script that just did includes ?

              <?php 
              include("ftp.php?site=192.168.0.1"); 
              include("ftp.php?site=192.168.0.2"); 
              include("ftp.php?site=192.168.0.3"); 
              include("ftp.php?site=192.168.0.4"); 
              include("ftp.php?site=192.168.0.5"); 
              ?>

              etc

              And call that new script from cron ? Would that work ?

                Yes and no. Includes run sequentially: each included file finishes everything it does BEFORE the next one starts, so you'd still be downloading from one site at a time. Also, you can't put a query string in an include, because after all it's not an HTTP request but rather a file system request.

                You could use system() to make the calls that run the additional files. Note the trailing & so each command runs in the background and system() returns immediately instead of waiting for it; /bin/sh doesn't understand bash's &>, hence the > /dev/null 2>&1. Something like this (my command line skills are fail, so it may be completely wrong):

                system('/usr/bin/php /path/to/file.php --ip=192.168.0.1 > /dev/null 2>&1 &');

                  You could also just store all of the IPs in a text file, one per line, and then pass in the path to the text file as well as the line number to be used when executing your script. Taking Derokorian's example above, you'd then have calls that look more like:

                  system('/usr/bin/php /path/to/file.php /path/to/ip_list.txt 0 > /dev/null 2>&1 &');
                  system('/usr/bin/php /path/to/file.php /path/to/ip_list.txt 1 > /dev/null 2>&1 &');
                  system('/usr/bin/php /path/to/file.php /path/to/ip_list.txt 2 > /dev/null 2>&1 &');
                  // etc.
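
                  A sketch of what the worker (file.php here) could do with those two arguments. The helper name pick_ip() is just illustrative; the FTP calls would be the ones shown earlier in the thread:

                  ```php
                  <?php
                  // pick_ip(): return the IP on the given zero-based line of the
                  // list file, or null if that line doesn't exist.
                  function pick_ip(string $listFile, int $line): ?string {
                      $ips = file($listFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
                      return $ips[$line] ?? null;
                  }

                  // Entry point when run as: php file.php /path/to/ip_list.txt 2
                  if (isset($argv[2])) {
                      $ip = pick_ip($argv[1], (int)$argv[2]);
                      if ($ip === null) {
                          fwrite(STDERR, "no IP on line {$argv[2]}\n");
                          exit(1);
                      }
                      // $con = ftp_connect($ip); ftp_login(...); ftp_get(...); ftp_close($con);
                  }
                  ```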

                  Of course, at that point you're basically emulating a fork(). Thus, I personally would take a similar approach but instead use [man]pcntl_fork[/man] to spawn multiple child processes that do the actual FTP'ing. The parent process (i.e. the one that spawned all of those child processes) could then use [man]pcntl_wait[/man] to wait until all of them have finished their work, at which point you could do some sort of status reporting if desired (e.g. check if any of them failed, record relevant statistics in a log file, etc.).
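
                  A minimal sketch of that fork/wait structure, wrapped in a hypothetical run_parallel() helper. The FTP work itself is left as a callback (in the real script that callback is where ftp_connect()/ftp_get() would go); this needs the pcntl extension, which is CLI-only:

                  ```php
                  <?php
                  // run_parallel(): fork one child per host, run $job($host) in each
                  // child, and return the number of children that exited non-zero.
                  function run_parallel(array $hosts, callable $job): int {
                      $children = [];
                      foreach ($hosts as $host) {
                          $pid = pcntl_fork();
                          if ($pid === -1) {
                              die("fork failed\n");
                          }
                          if ($pid === 0) {
                              // Child process: do the work for one host, then exit.
                              exit($job($host) ? 0 : 1);
                          }
                          $children[$pid] = $host;  // parent remembers its children
                      }

                      // Parent: reap every child and count the failures.
                      $failed = 0;
                      while ($children) {
                          $pid = pcntl_wait($status);
                          if ($pid === -1) {
                              break;
                          }
                          if (pcntl_wexitstatus($status) !== 0) {
                              $failed++;
                          }
                          unset($children[$pid]);
                      }
                      return $failed;
                  }
                  ```

                  Called as e.g. run_parallel($hostList, 'do_ftp_for_host'), where do_ftp_for_host is a hypothetical function doing the connect/login/get for one server.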

                    This being a cron job, it's not necessary to write the launch script in PHP; a shell script that loops through the IPs in a text file and launches a background PHP instance for each could also work.
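
                    Such a launcher might look like the sketch below. The paths and the worker invocation are placeholders; here the PHP worker is simulated with an echo into a log so the loop structure is visible, and for the demo the IP list is generated rather than maintained by hand:

                    ```shell
                    #!/bin/sh
                    # Launch one background worker per IP in the list, then wait for all.
                    printf '10.0.0.1\n10.0.0.2\n10.0.0.3\n' > ip_list.txt

                    : > launched.log
                    while IFS= read -r ip; do
                        [ -n "$ip" ] || continue
                        # The real job would be:
                        #   /usr/bin/php /path/to/ftp.php --ip="$ip" > /dev/null 2>&1 &
                        ( echo "worker for $ip" >> launched.log ) &
                    done < ip_list.txt

                    wait    # block until every background worker has exited
                    echo "all workers finished"
                    ```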

                      Thanks, I'll try this and see how I get on !
