Hi, for part of my project there's this:
// Follow the top link and collect the links found on that page
$s = 0;
$ii = 0; // outer counter (was never initialised)
while ($ii < $result)
{
    $newcontent = file_get_contents($matches[$s][2]); // lower-case: file_get_contents()
    // Put the new links into a new array
    if (preg_match_all("/$links/siU", $newcontent, $matches, PREG_SET_ORDER)) {
        $result = count($matches);
        echo $result;
    }
    // Display the new array elements
    echo "<pre>";
    for ($i = 0; $i < $result; $i++) {
        print_r($matches[$i]);
    }
    $newcontent = $matches[$s][2];
    echo $matches[$s][2];
    $s++;  // increment once per pass (two $s++ in a row skipped every other link)
    $ii++; // advance the outer counter so the loop can terminate
}
?>
This loops like a web crawler, but it stops dead at any URL it cannot access, such as a relative or malformed link like '\something\'. Could anyone point me in the direction of handling the case where a URL is inaccessible, e.g. skipping it and carrying on?
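For reference, this is roughly the kind of guard I have in mind — a minimal sketch only, and the helper name fetch_page is my own. It relies on file_get_contents() returning false on failure, plus a filter_var() check so that junk like '\something\' never reaches the network at all:

```php
<?php
// Hypothetical helper (fetch_page is my own name, not from the project):
// returns the page body on success, or false if the URL is invalid
// or could not be fetched.
function fetch_page($url)
{
    // Reject anything that is not a syntactically valid absolute URL,
    // so relative/malformed links are skipped without a network call.
    if (filter_var($url, FILTER_VALIDATE_URL) === false) {
        return false;
    }

    // '@' suppresses the PHP warning on failure; the false return
    // value is the signal we act on instead.
    $content = @file_get_contents($url);
    return $content === false ? false : $content;
}
?>
```

Inside the crawler loop it would then be used along the lines of: `$newcontent = fetch_page($matches[$s][2]); if ($newcontent === false) { $s++; continue; }` so a dead link is skipped rather than stopping the crawl — but I'm not sure if this is the idiomatic way to do it.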
thanks