Hi Guys,
I have a massive data feed of 400,000+ rows, and I need the fastest, least server-intensive way to insert it into multiple database tables.
I have looked at LOAD DATA INFILE, but my problem is that the CSV feed holds data which needs to be split into several tables. If I break up the feed, I would then have to either loop through 7 x 400,000+ rows or do 7 bulk inserts of 400,000 rows each, which takes more time and kills the server.
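To make the "breaking up the feed" part concrete, this is roughly what I mean (the column names and the table split below are made up for illustration; my real feed has enough columns for 7 tables):

```python
import csv

# Hypothetical mapping: which feed columns go into which table's load file.
# My real feed would map its columns across 7 target tables.
TABLE_COLUMNS = {
    "customers": ["customer_name", "customer_email"],
    "orders": ["order_ref", "order_date"],
}

def split_feed(feed_path):
    """Split the one wide CSV into one file per target table in a single
    pass, so each file could then be bulk-loaded with LOAD DATA INFILE."""
    writers = {}
    outputs = []
    try:
        # Open one output file per target table up front.
        for table in TABLE_COLUMNS:
            f = open(f"{table}.csv", "w", newline="")
            outputs.append(f)
            writers[table] = csv.writer(f)
        # Stream the feed once, fanning each row out to every output file.
        with open(feed_path, newline="") as feed:
            for row in csv.DictReader(feed):
                for table, cols in TABLE_COLUMNS.items():
                    writers[table].writerow([row[c] for c in cols])
    finally:
        for f in outputs:
            f.close()
```

Even done in one pass like this, I still end up with 7 separate bulk loads of 400,000 rows each afterwards, which is what I'm trying to avoid.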