I have a PHP script that does inserts, and for some reason, now and then it will insert the same record 10 or 15 times. I think it times out before it finishes and starts over. My question is: how do I delete 9 out of 10 identical records, so that I keep the one that was supposed to be inserted and delete the 9 others that weren't?
You shouldn't be fixing the errors made by your script after they have been made; you should stop the script from making the mistake in the first place.
Show us some of the code that has this strange behaviour and let's see what the problem is.
To expand a little on what Vincent is saying: your database should be designed so that multiple identical records can't happen in the first place. Add constraints at the database layer, such as a unique constraint on the relevant columns, so that this kind of error is rejected by the database itself rather than cleaned up afterwards.
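For example, something along these lines (just a sketch; the table and column names are made up, so adjust them to your actual schema):

-- Hypothetical history table keyed by the date of the daily snapshot.
-- The unique constraint makes a second insert for the same date fail
-- instead of silently creating a duplicate row.
ALTER TABLE daily_history
    ADD CONSTRAINT uq_daily_history_date UNIQUE (entry_date);

With that in place, a script that retries after a timeout just gets an error on the duplicate insert instead of corrupting your data.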
I fixed the script after my post. What I asked was whether there is a way to delete all (save one record) of the data that was entered more than once. Oh, and I would constrain the table from allowing duplicate records, except that sometimes there might actually be legitimate duplicates. The data in this table is historical; it gets entered once a day. I'm asking because my script broke and entered records for, say, 12-30-00 15 times, and I want to get rid of all the 12-30-00 records except one of them.
You could copy the data to a new table using a SELECT DISTINCT or GROUP BY query to get unique records, something like the sketch below. The trouble is that you say sometimes there really are duplicates, in which case it is impossible to tell whether a given row is a genuine duplicate or an error.
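Roughly like this (a sketch only; the table and column names are hypothetical, and the exact CREATE TABLE ... AS syntax varies a little between databases):

-- Build a deduplicated copy: one row per distinct combination of columns.
-- List your real columns explicitly and leave out any auto-increment id,
-- otherwise every row will look distinct and nothing gets collapsed.
CREATE TABLE daily_history_clean AS
    SELECT entry_date, field_a, field_b
    FROM daily_history
    GROUP BY entry_date, field_a, field_b;

Once you've checked the new table looks right, drop (or rename away) the original and rename daily_history_clean into its place. Just be aware that this collapses the legitimate duplicates too, for exactly the reason above.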