I'm trying to figure out a (quicker and easier) way of importing 100K, 250K, 500K, even up to 1M records into MySQL from a text file. Two problems:

  • Getting the text file to the server. A text file with 500K records is about 14 MB. mysqlimport has --local and --compress options, but I haven't played around with those yet (there's a sketch of both pieces after this list). I'm also considering just letting the user FTP the file (gzipped, presumably) and having some sort of web interface to pick the file on the server for import.

  • The import itself. I'm using 'LOAD DATA INFILE ...'. Importing 500K records takes around two minutes; the table has two indexes.
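
Roughly what I have in mind for both pieces; the user, database, table name, and file path below are just placeholders:

    # --local reads the file from the client machine; --compress compresses
    # the traffic between client and server. mysqlimport takes the table
    # name from the file name, so this loads mytable.txt into table "mytable".
    mysqlimport --local --compress -u myuser -p mydb /path/to/mytable.txt

    # The same load through the mysql client (tab-separated, one record per
    # line is the default format); LOCAL has to be enabled on both ends.
    mysql --local-infile=1 -u myuser -p mydb \
      -e "LOAD DATA LOCAL INFILE '/path/to/mytable.txt' INTO TABLE mytable"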

    FTP would do the trick. Just now I dumped a 1,300,000+ record table to a file of about 45 MB, and I'm FTPing it as I write this.

    Gzipped files won't work directly, so you'll have to gunzip on the fly and pipe/redirect the output to mysql. I haven't tried that, though; it could crash if RAM is tight.
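
    It would look something like this, assuming the dump is a plain SQL file (e.g. mysqldump output); the file and database names are placeholders:

        # decompress on the fly and pipe the statements straight to the server
        gunzip -c dump.sql.gz | mysql -u myuser -p mydb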

    Regards
    gerardo

      • [deleted]

      Things that speed up loads:

      Use InnoDB tables (ISAM and MyISAM are outdated anyway).
      When using InnoDB tables you can wrap the load in a transaction to speed it up further, and also to make sure that an aborted load operation doesn't mess up your data (see the sketch below).
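
      Roughly like this (table name, file path, and credentials are placeholders); turning autocommit off and committing once at the end keeps the whole load in a single InnoDB transaction, so a failure before the COMMIT simply rolls back:

          mysql --local-infile=1 -u myuser -p mydb -e "
            SET autocommit = 0;
            LOAD DATA LOCAL INFILE '/path/to/mytable.txt' INTO TABLE mytable;
            COMMIT;"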

      About the speed itself: remember that if you insert 500K records in two minutes, that's 500000/120 ≈ 4166 records per second. That's quite fast already.
