I'm writing a script that has to download over 100 different web pages and then store the information in a database. The problem is that it stops downloading after a while (it probably times out or something).
The code below only echoes up to 92 even though it should run 100 times. Google is just an example; I'm downloading a different page every time, but this shows the problem.
<?php
// Fetch a page 100 times (in the real script each URL is different)
for ($i = 0; $i < 100; $i++) {
    $strPage = file_get_contents("http://www.google.com");
    echo "$i<br>";
}
?>
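From what I've read, PHP kills a script once it passes max_execution_time (30 seconds by default), so I'm guessing something like this might keep it alive? This is just a sketch of my guess, not something I've confirmed works; the 10-second per-request timeout is a number I made up:

<?php
// Guess: lift PHP's overall script time limit so the loop isn't killed,
// and give each download its own timeout so one slow page can't hang forever.
set_time_limit(0); // 0 = no limit on total script run time

// Per-request timeout via a stream context (10 seconds is an arbitrary guess)
$context = stream_context_create(array(
    'http' => array('timeout' => 10)
));

for ($i = 0; $i < 100; $i++) {
    $strPage = file_get_contents("http://www.google.com", false, $context);
    if ($strPage === false) {
        echo "$i failed<br>"; // keep going instead of dying on one bad page
        continue;
    }
    echo "$i<br>";
}
?>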
The only other way I can think of doing this is redirecting the page to itself 100 times, one download per request, something like the sketch below. Is that really the only way?
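By redirecting I mean roughly this (the script name thisscript.php and the i query parameter are just placeholders I made up):

<?php
// The redirect idea: handle one page per request, then bounce back to
// this same script with the next index passed along in the query string.
$i = isset($_GET['i']) ? (int) $_GET['i'] : 0;

if ($i < 100) {
    $strPage = file_get_contents("http://www.google.com");
    // ...store $strPage in the database here...
    header("Location: thisscript.php?i=" . ($i + 1)); // must run before any output
    exit;
}

echo "done";
?>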
Thanks, and yeah, I'm a noob. :quiet: