In theory, can I do the following, then check the header response code for each URL and mark it as indexed?
$cnturl = mysql_query("SELECT `url` FROM DATABASE WHERE `indexed` = '0'");
if (mysql_num_rows($cnturl)) {
    do {
        $geturls = mysql_query("SELECT `url` FROM DATABASE WHERE `indexed` = '0'");

        // cURL the result pages. get_urlinfo is the callback: if the header
        // response code for the URL is 200, the row is updated to `indexed` = '1';
        // otherwise it stays at '0' and is retried on the next pass (hence the do/while).
        $rc = new RollingCurl("get_urlinfo");
        $rc->window_size = 10;
        while ($row = mysql_fetch_assoc($geturls)) { // a result resource can't be foreach'd directly
            $request = new RollingCurlRequest($row['url']);
            $rc->add($request);
        }
        $rc->execute();

        // Re-count after the batch so the loop exits once every URL has been indexed.
        $remaining = mysql_query("SELECT `url` FROM DATABASE WHERE `indexed` = '0'");
    } while (mysql_num_rows($remaining));
}
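For context, here is a minimal sketch of what my get_urlinfo callback does, assuming RollingCurl invokes it with the response body, the curl_getinfo() array, and the originating request, and that the table has a `url` column (column and table names here match the snippet above, but adjust as needed):

```php
<?php
// Pure helper so the success condition is easy to test: only a 200
// response counts as successfully gathered.
function is_indexed_ok($http_code) {
    return $http_code == 200;
}

// Hypothetical callback wired into RollingCurl above. On a 200 response
// the row is flagged indexed; on anything else it is left at '0' so the
// outer do/while picks it up again.
function get_urlinfo($response, $info, $request) {
    if (is_indexed_ok($info['http_code'])) {
        $url = mysql_real_escape_string($request->url);
        mysql_query("UPDATE DATABASE SET `indexed` = '1' WHERE `url` = '$url'");
    }
}
```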
Any feedback is greatly appreciated. I think this will work nicely, but I wanted to run it by the experts first 🙂 I have 870 URLs on one site that I need to query as fast as possible, while making sure every page is actually gathered.