Yes, we get attempts at duplicate entries which, obviously, are not successful.
I don't like the idea of doing an insert and then, if we get a duplicate-key error, falling back to an update or looping until it succeeds.
That seems wrong to me. I am a bit of a perfectionist, really. I know that locking the tables is probably the right way to do it from a technical point of view.
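For context, the pattern I'm trying to avoid looks roughly like this. A quick Python sketch, with sqlite3 only standing in for our real database and the table and column names made up:

import sqlite3

# Throwaway in-memory table standing in for the real one (names made up)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE counters (name TEXT PRIMARY KEY, hits INTEGER)")

def record_hit(name):
    # The "try the insert, fall back to an update on a duplicate key" pattern
    try:
        conn.execute("INSERT INTO counters (name, hits) VALUES (?, 1)", (name,))
    except sqlite3.IntegrityError:
        # Duplicate key: the row already exists, so update it instead
        conn.execute("UPDATE counters SET hits = hits + 1 WHERE name = ?", (name,))
    conn.commit()

record_hit("homepage")
record_hit("homepage")
print(conn.execute("SELECT * FROM counters").fetchall())  # [('homepage', 2)]

It works, but it feels like papering over the race rather than preventing it.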
There is nothing wrong with the methodology or indexing here; I was just wondering if anyone knew how badly locking would affect performance.
Oh, by the way, regarding the second issue... does anyone know of any decent ways to effectively test the performance of entire applications? Some way of replicating multiple concurrent site users?
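To give an idea of the kind of thing I mean, the crudest version would be something like this (a rough Python sketch; the URL is just a placeholder and the numbers are arbitrary):

import threading
import time
import urllib.request

URL = "http://localhost/some-page"   # placeholder, not a real address
CONCURRENT_USERS = 20                # arbitrary numbers for the sketch
REQUESTS_PER_USER = 10

def simulate_user(timings):
    # Each "user" requests the page in a loop and records response times
    for _ in range(REQUESTS_PER_USER):
        start = time.time()
        try:
            urllib.request.urlopen(URL, timeout=10).read()
        except OSError:
            pass  # ignore failures for this rough sketch
        timings.append(time.time() - start)

timings = []
threads = [threading.Thread(target=simulate_user, args=(timings,))
           for _ in range(CONCURRENT_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"{len(timings)} requests, average {sum(timings) / len(timings):.3f}s")

But I'd rather use something purpose-built than roll my own, so recommendations are very welcome.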
Bye,
Matt B