If anyone can share their thoughts on this problem, it would be greatly appreciated:
I'm working with MySQL to manage some formatted text files from an older project.
The database has one table with 14 columns that at any given time will only have approximately 5000 rows.
The text files contain data for only 12 of the columns; the other 2 columns are custom fields edited only in the MySQL table.
Each text file has on average 200 records, usually between 70 and 100, with one or two topping 400.
What I'm trying to do is synchronize this table with the external files, without losing the custom fields in the 2 columns of the records already in the table.
So far I have thought of handling this by:
Loading the text file into memory (an array).
Exploding the ID field from the array into a list in the form '123','234','657'.
Doing a DELETE FROM table WHERE ID NOT IN (values_above). This removes records that were deleted from the text files.
Using the same array, looping to UPDATE the existing records: UPDATE table SET columns 1..12 WHERE ID = Array[ID].
This way I preserve the 2 custom columns.
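The steps above can be sketched like this. It's a minimal sketch only: an in-memory SQLite database stands in for MySQL, the schema (records, id, col1/col2 for the file-backed columns, custom1/custom2 for the hand-edited ones) is made up for illustration, and only 2 of the 12 file columns are shown.

```python
import sqlite3

def sync_existing(conn, file_rows):
    """file_rows: rows parsed from the text file, as dicts keyed by column."""
    ids = [r["id"] for r in file_rows]          # assumes the file is non-empty
    placeholders = ",".join("?" * len(ids))
    # Remove rows that no longer appear in the text file.
    conn.execute(f"DELETE FROM records WHERE id NOT IN ({placeholders})", ids)
    # Refresh the file-backed columns, one UPDATE per record. custom1 and
    # custom2 are never mentioned, so the hand-edited values survive the sync.
    for r in file_rows:
        conn.execute(
            "UPDATE records SET col1 = ?, col2 = ? WHERE id = ?",
            (r["col1"], r["col2"], r["id"]),
        )
    conn.commit()
```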
How about inserting new records that aren't yet in the table? I could do another loop that tests whether an ID is in the table and, if not, does an INSERT, but I'm looking for a better solution to this whole problem.
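That extra loop could look like the following, under the same hypothetical schema as before: one SELECT per record to test for existence, then an INSERT when the ID is missing (the 2 custom columns simply start out as NULL).

```python
import sqlite3

def insert_missing(conn, file_rows):
    # For each record from the text file: check whether the ID exists, and
    # INSERT it when it does not. Existing rows are left alone here; the
    # UPDATE loop handles those.
    for r in file_rows:
        exists = conn.execute(
            "SELECT 1 FROM records WHERE id = ?", (r["id"],)
        ).fetchone()
        if exists is None:
            # New row: custom1/custom2 are omitted and default to NULL.
            conn.execute(
                "INSERT INTO records (id, col1, col2) VALUES (?, ?, ?)",
                (r["id"], r["col1"], r["col2"]),
            )
    conn.commit()
```

The per-record round trips are what inflate the query count; on MySQL, INSERT ... ON DUPLICATE KEY UPDATE (if your server version supports it) can fold the existence check, the INSERT and the UPDATE into a single statement per record.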
Using the above solution, a text file with 150 records takes at least 301 queries to sync (1 DELETE, plus one existence check and one UPDATE per record), and that can reach 451 queries if none of the records from the text file are in the table yet.
Please post your comments on this, many thanks!
PS: If I didn't have the 2 custom fields in the table, I could just re-format the text file, do a DELETE FROM table WHERE ID IN (the file's IDs) and then do a LOAD DATA INFILE.
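For reference, that PS approach might be sketched as below. LOAD DATA INFILE is MySQL's bulk loader and has no SQLite equivalent, so a plain executemany() plays that role in this stand-in; the schema is hypothetical as before, with the 2 custom columns dropped since this path assumes they don't exist.

```python
import sqlite3

def reload_all(conn, file_rows):
    # Delete the file's rows, then bulk-load the re-formatted file.
    # (MySQL would do the second step with LOAD DATA INFILE.)
    ids = [r["id"] for r in file_rows]          # assumes the file is non-empty
    placeholders = ",".join("?" * len(ids))
    conn.execute(f"DELETE FROM records WHERE id IN ({placeholders})", ids)
    conn.executemany(
        "INSERT INTO records (id, col1, col2) VALUES (:id, :col1, :col2)",
        file_rows,
    )
    conn.commit()
```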