I have a content distribution script I wrote that works beautifully and handles just about everything I can think of.
Code, docs, etc. at:
http://www.shrum.net/code/cats
Now I'm to the point where I want to stress test it, and the first thing I saw was a rapid decline in output speed as I increased the number of entries to process.
This seemed strange to me, since the operations are the same for every record.
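For the record, the timings below are just a wall-clock wrapper around the whole run. In Python terms it's something like this sketch (simplified; process_entries is a stand-in for the real entry point, not the actual code):

```python
import time

def process_entries(count):
    """Stand-in for the real per-record pipeline (fetch, transform, write)."""
    ...

start = time.perf_counter()
process_entries(400)  # number of entries for this run
elapsed = time.perf_counter() - start
print(f"{elapsed:.4f} secs ~ {400 / elapsed:.0f} per sec")
```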
System config can be found here:
Here's what I'm seeing (times cover the entire script, start to finish, not just the MySQL query time):
50 entries = 0.5231 secs ~ 96 per sec
100 entries = 1.2029 secs ~ 83 per sec
200 entries = 4.2847 secs ~ 47 per sec
300 entries = 11.5797 secs ~ 26 per sec
400 entries = 18.9632 secs ~ 21 per sec
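Doing the math on those runs, the cost grows much faster than linearly: 8x the entries (50 to 400) costs roughly 36x the time, which is closer to quadratic than linear. A quick sanity check on the scaling exponent, using only the numbers above:

```python
import math

# (entries, seconds) pairs from the runs above
runs = [(50, 0.5231), (100, 1.2029), (200, 4.2847),
        (300, 11.5797), (400, 18.9632)]

# Estimate k in time ~ n**k between consecutive runs.
for (n1, t1), (n2, t2) in zip(runs, runs[1:]):
    k = math.log(t2 / t1) / math.log(n2 / n1)
    print(f"{n1} -> {n2} entries: exponent ~ {k:.2f}")
```

The exponents come out between about 1.2 and 2.5, so something in the per-record work is apparently getting more expensive as the total record count grows.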
Is there anything I can do to make this faster, or is this an issue with the OS / hardware?
I'm not asking for a full-on code review (yet); I just want to be able to do at least 1000 records without hitting such a bottleneck. I'm hoping it's just an inefficient function call or something simple like that.
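In case it matters, my next step is to profile a run to see where the time actually goes. In Python that would be something like the following (cProfile and pstats are stdlib; main is just a hypothetical stand-in for the script's entry point):

```python
import cProfile
import pstats

# Profile a full 400-entry run and save the stats to a file.
cProfile.run("main(400)", "cats.prof")  # 'main' is a stand-in for the real entry point

# Print the 15 most expensive calls by cumulative time.
stats = pstats.Stats("cats.prof")
stats.sort_stats("cumulative").print_stats(15)
```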
TIA