I'm having a bit of a problem with a script that runs as part of a larger page. The script fetches a text file from another server, parses what's in it, and displays the results.
Sometimes the server on which this text file is located is not available. I use file_get_contents to retrieve the file, with a conditional so that if the result of file_get_contents is false, a 'file unavailable' message appears. Unfortunately, when the file is unavailable the entire page instead hangs for a good 30 seconds and ends up displaying an error message, effectively rendering the whole page useless. Stranger still, if I replace the file URL with a random URL that I know doesn't exist, the script correctly identifies that it doesn't exist instantly.
Here's the code fragment involved:
if ($details = @file_get_contents("http://www.thesite.com/text.txt")) {
    // Perform processing
} else {
    echo "File unavailable";
}
Is there a way to make file_get_contents time out after a couple of seconds instead of creating that huge delay and error message? Or is there a simpler way of getting around this problem?
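I did come across stream contexts in the manual, and I'm wondering whether passing one with a short timeout to file_get_contents would do it. Here's an untested sketch of what I have in mind (the 3-second value is just a guess, and the URL is the same placeholder as above):

// Untested sketch: pass a stream context with a short read timeout
// to file_get_contents so it gives up quickly if the server is down.
$context = stream_context_create(array(
    'http' => array(
        'timeout' => 3, // seconds to wait before giving up (placeholder value)
    ),
));

if ($details = @file_get_contents("http://www.thesite.com/text.txt", false, $context)) {
    // Perform processing
} else {
    echo "File unavailable";
}

Would that be the right approach, or am I missing something?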
Thanks in advance.