I have a script that reads URLs from a database table. I use HTTP_Request (from PEAR) to get a response code for each URL. One or two URLs are timing out on me and returning a non-fatal PHP error.
I have this: ini_set('max_execution_time', 200); and I do not want to raise it any further.
I'd like to add a timing feature to my $req->sendRequest() call so that it runs for, say, 10-20 seconds, and if no response comes back, the code drops that request and continues with the next URL in the list.
Any thoughts?
Here is my code, thus far.
    while ($urlData = $results->fetchRow(DB_FETCHMODE_ASSOC)) {
        $url = $urlData['HL_URL'];
        if ($url) {
            $req =& new HTTP_Request($url);
            // this next line is where I need a timing element
            $req->sendRequest();
            $code = $req->getResponseCode();
            switch ($code) {
                case 200:
                    echo "Everything's ok\n";
                    break;
                case 300:
                    echo "Multiple choices (redirect)\n";
                    break;
                case 404:
                    echo "Document not found\n";
                    break;
            } // end switch
        } // end if
    } // end while
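One approach I'm considering (this is an assumption on my part from reading the HTTP_Request docs, not something I've verified): the constructor takes a second params array, which I believe accepts a 'timeout' (connection timeout in seconds) and a 'readTimeout' (array of seconds and microseconds for the socket read), and sendRequest() returns a PEAR_Error on failure that can be caught with PEAR::isError(). Something along these lines:

```php
<?php
require_once 'HTTP/Request.php';

// Sketch only -- the 'timeout' and 'readTimeout' params and the
// checkUrl() wrapper name are my assumptions, not tested code.
function checkUrl($url)
{
    $req =& new HTTP_Request($url, array(
        'timeout'     => 10,            // give up connecting after 10s
        'readTimeout' => array(20, 0),  // give up reading after 20s
    ));
    $result = $req->sendRequest();
    if (PEAR::isError($result)) {
        // timed out or otherwise failed -- skip this URL
        echo "Skipping $url: " . $result->getMessage() . "\n";
        return false;
    }
    return $req->getResponseCode();
}
```

The loop could then call checkUrl() for each row and simply continue when it returns false, instead of letting the slow request eat into max_execution_time.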
Thanks for any ideas
jdc44--