Hi,
I'm looking for a way to read a file that is located on a remote server and copy all the links out of it.
Later, all the links have to be indexed in a small database for my site, with the URL and the description of the link stored separately.
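To make clear what I mean, this is roughly the shape of the data I'd like to end up with for each page (just an illustration of the goal, not code I already have; the example URLs and descriptions are made up):

$links = array(
    array("url" => "http://www.php.net/docs.php",      "description" => "Documentation"),
    array("url" => "http://www.php.net/downloads.php", "description" => "Downloads"),
);
// each entry would then become one row in the database (URL + description)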
I tried several things with fopen() and file(), but the only thing that really worked was reading the file itself. For that I used the code below:
$fp = fopen("http://www.php.net", "r");
if ($fp) {
    echo "connect!<br>\n";
    // read the remote file line by line and print it
    while ($line = fgets($fp, 512)) {
        echo htmlspecialchars($line) . "<br>\n";
    }
    fclose($fp);
} else {
    echo "Can't connect";
}
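What I still can't figure out is how to pull the links out of what I read. I imagine it would be something like the sketch below, but the regex pattern is just my own guess (it will probably miss some cases) and the database part is only a placeholder comment:

// collect the whole page into one string first
$html = "";
$fp = fopen("http://www.php.net", "r");
if ($fp) {
    while ($line = fgets($fp, 4096)) {
        $html .= $line;
    }
    fclose($fp);
}

// very naive pattern: grab the href value and the text between <a> and </a>
preg_match_all('/<a[^>]+href="([^"]+)"[^>]*>(.*?)<\/a>/is', $html, $matches);

for ($i = 0; $i < count($matches[1]); $i++) {
    $url         = $matches[1][$i];
    $description = strip_tags($matches[2][$i]);
    // here I would INSERT $url and $description into the database
    echo htmlspecialchars($url) . " - " . htmlspecialchars($description) . "<br>\n";
}

Is something like this the right direction, or is there a better way to do it?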
I really don't know how to go further than that. After reading about 100-200 topics in this forum I've picked up some information, but I'm hoping for a more specific answer.
Regards,
Tim