I'm trying to figure out a quicker and easier way of importing 100K, 250K, 500K, or even up to 1M records into MySQL from a text file. Two problems:
Getting the text file to the server. A text file with 500K records is about 14MB. mysqlimport has --local and --compress options, but I haven't played around with those yet. I'm also considering letting the user FTP the file (gzipped, presumably) and providing some sort of web interface to select the file on the server for import.
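For what it's worth, a sketch of what the --local/--compress route might look like; the hostname, user, database, and file name (dbserver, importuser, mydb, records.txt) are placeholders, not from my actual setup:

```
# --local reads the file from the client machine instead of the server's
# filesystem; --compress reduces traffic on the wire (useful for ~14MB).
mysqlimport --local --compress \
    --fields-terminated-by='\t' \
    -h dbserver -u importuser -p \
    mydb records.txt
```

Note that mysqlimport derives the target table name from the file's basename, so the file would need to be named after the table (records.txt loads into a table called records).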
The import itself. I'm using LOAD DATA INFILE .... Importing 500K records takes around two minutes; the table has two indexes.
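Since the two indexes are probably where much of that two minutes goes, one thing I'm considering is disabling the non-unique indexes during the load and rebuilding them afterwards. A sketch, assuming a MyISAM table with a hypothetical name of records and tab-delimited data:

```
-- DISABLE KEYS defers maintenance of non-unique indexes (MyISAM);
-- rebuilding them in one pass after the load is usually faster than
-- updating them row by row during it.
ALTER TABLE records DISABLE KEYS;

LOAD DATA INFILE '/tmp/records.txt'
INTO TABLE records
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';

ALTER TABLE records ENABLE KEYS;
```

With LOCAL added (LOAD DATA LOCAL INFILE), the file is read from the client rather than the server, which would also address problem 1.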