hello,
I've had bad experiences with uploading and downloading large amounts of data:
(a) I "backuped" a database via e-mail, i.e. had sent its contents to my e-mail address:
the tables held about a thousand records, none of them particularly large.
nonetheless, the backup script suddenly stopped working because it ran out of memory. how soon that happens depends on how much memory your script is allowed to use; the problem lies in the principle.
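what I would probably try next is writing the dump row by row instead of collecting everything in one big string first. a minimal sketch, assuming mysqli; the table name, credentials and file path are only placeholders:

<?php
// stream each row straight to a file so the whole dump never sits in memory
$db  = new mysqli('localhost', 'user', 'pass', 'mydb');
$out = fopen('/tmp/backup.csv', 'w');

$result = $db->query('SELECT * FROM records', MYSQLI_USE_RESULT); // unbuffered
while ($row = $result->fetch_assoc()) {
    fputcsv($out, $row);   // one row at a time, memory use stays flat
}
$result->close();
fclose($out);
?>

the finished file could then be attached to the mail (or fetched some other way) without the script ever holding all the records at once.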
(b) I uploaded a lot of table data via a php script (an import script, over http):
the script timed out 🙁
I had to chop the data into smaller portions; that was no fun.
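the chopping could probably also be done inside the script itself: process the file in small batches and reset the time limit after each batch, so no single stretch runs into max_execution_time. again only a sketch; the file name and batch size are made up, and it doesn't help if the browser or web server drops the connection first, or if safe mode disallows set_time_limit:

<?php
$fh    = fopen('/tmp/import.csv', 'r');
$count = 0;

while (($fields = fgetcsv($fh)) !== false) {
    // ... insert $fields into the database here ...
    if (++$count % 500 === 0) {
        set_time_limit(30);   // restart the timeout counter for the next batch
    }
}
fclose($fh);
?>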
perhaps it would work better with the ftp functions (roughly as in the sketch below) ... does anyone have a better idea?
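what I mean by the ftp functions is something like this; host, login and paths are only placeholders:

<?php
// upload the dump file over ftp instead of pushing it through an http form
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'pass');
ftp_put($conn, 'backup/backup.csv', '/tmp/backup.csv', FTP_BINARY);
ftp_close($conn);
?>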