Ok, here's the deal:
I'm writing a program that works like a search engine spider: it opens many remote locations, pulls the needed data off each one, and compiles it into a MySQL database.
Currently, I have the script's time limit set to 800 seconds to get around PHP's default of 30 seconds, but this created a new problem:
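(For reference, this is all I'm doing to raise the limit; set_time_limit() is the standard call, and setting max_execution_time in php.ini would do the same thing:)

<?php
// Raise the execution cap from PHP's default 30 seconds to 800.
set_time_limit(800);
?>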
There is a concern that this will overload the web server and put a strain on the bandwidth. I want to build a user-defined script throttle of sorts: an automated pause (in seconds) after each remote location loads and before the next one starts.
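To make it concrete, here's a rough sketch of what I'm picturing. $urls, $throttle, and save_to_db() are just placeholders for my list of remote locations, the user-supplied pause, and my existing MySQL insert code:

<?php
// $throttle: user-defined pause in seconds (placeholder value here).
$throttle = 5;

foreach ($urls as $url) {
    // Load the remote location.
    $data = file_get_contents($url);
    if ($data !== false) {
        save_to_db($data); // my existing MySQL insert logic
    }
    // Pause before loading the next remote location.
    sleep($throttle);
}
?>

As far as I understand, on Linux the time spent in sleep() doesn't even count toward max_execution_time (it does on Windows, where real time is measured), so the pauses shouldn't eat into the 800-second budget. Is sleep() the right way to do this, or is there a better approach?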
Anyone have any suggestions? They would be appreciated.
(And even if it doesn't overload the server, I STILL need to have the user-defined throttle.)
thanks