I am writing a PHP program for general distribution, and I would like to include a utility to back up data from only the tables used by my program. Keep in mind that users may have several programs running from a single database (which is why I prefix all of my table names with a string based on the program's name).
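For context, the table-discovery step I have in mind looks roughly like this. It's only a sketch: the `myapp_` prefix, database name, and credentials are placeholders, and I'm assuming a PDO/MySQL connection:

```php
<?php
// Sketch: list only this program's tables by prefix.
// "myapp_" and the connection details are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=shared_db;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$prefix = 'myapp_';

// Escape LIKE wildcards so "_" in the prefix matches literally,
// then quote the whole pattern before interpolating it.
$pattern = $pdo->quote(addcslashes($prefix, '%_') . '%');
$tables  = $pdo->query("SHOW TABLES LIKE $pattern")->fetchAll(PDO::FETCH_COLUMN);
```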
Would it be reliable to back up several tables of tens of thousands of records each to a single file? Roughly how long should it take to write that many rows of data to a file? And would a single SELECT * statement per table work, or would I need to LIMIT the SELECT to a certain number of records at a time to avoid overloading the server or PHP's memory?
Remember, compatibility is the key, so if an average server can only handle a SELECT of 10,000 records at a time, I would probably want to LIMIT it...
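To make the question concrete, this is the kind of chunked export loop I'm considering. Again, just a sketch: `$pdo` is the connection from above, the table name, output file, and the 10,000-row chunk size are all placeholders:

```php
<?php
// Sketch: dump one table to a file in fixed-size chunks using LIMIT/OFFSET.
// "myapp_orders", "backup.csv", and the chunk size are example values only.
$table     = 'myapp_orders';
$chunkSize = 10000;
$offset    = 0;

$out = fopen('backup.csv', 'w');

do {
    $stmt = $pdo->query(
        "SELECT * FROM `$table` LIMIT $chunkSize OFFSET $offset"
    );
    $rows = $stmt->fetchAll(PDO::FETCH_NUM);

    foreach ($rows as $row) {
        fputcsv($out, $row);   // one CSV line per record
    }

    $offset += $chunkSize;
} while (count($rows) === $chunkSize);

fclose($out);
```

If a single SELECT * per table is safe in practice on an average server, I'd happily drop the LIMIT/OFFSET loop and keep it simple.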
Thanks!