I have been working on a sort of meta-search script that queries four external sites (for a file with a predictable name) and returns information from them. Sometimes it works fine; other times it takes excessively long to respond and occasionally times out.
I think the problem is that when one or more of the remote sites does not have the requested page (that part is not predictable), the script runs far too long. I doubt my users are going to wait.
Possible solutions: I currently use read() to fetch the remote file. I have experimented with fsockopen(), which has a timeout parameter. However, my existing code relies on the file being read into an array and regex-searched line by line, and with fsockopen() I don't know how to get the response into an array (or, alternatively, a good way to efficiently search the whole response for several pieces of information).
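One way to get the best of both: fetch the page through fsockopen() (so the connect timeout applies), guard each subsequent read with stream_set_timeout(), and then split the response body on newlines so the existing line-by-line regex code keeps working. This is a minimal sketch, not a drop-in fix: it assumes the sites speak plain HTTP on port 80, and the function names `fetch_lines` and `response_body_lines` are made up for illustration.

```php
<?php
// Fetch a remote page with a connect timeout (fsockopen's 5th argument)
// and a per-read timeout (stream_set_timeout), then return the body as
// an array of lines, mirroring what file() would have given you.
function fetch_lines($host, $path, $timeout = 5)
{
    $fp = @fsockopen($host, 80, $errno, $errstr, $timeout); // connect timeout
    if (!$fp) {
        return false; // could not connect in time; caller can skip this site
    }
    stream_set_timeout($fp, $timeout); // guard each read, not just the connect

    fwrite($fp, "GET $path HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");

    $raw = '';
    while (!feof($fp)) {
        $chunk = fgets($fp, 4096);
        $info  = stream_get_meta_data($fp);
        if ($chunk === false || $info['timed_out']) {
            break; // a read stalled; keep whatever arrived so far
        }
        $raw .= $chunk;
    }
    fclose($fp);
    return response_body_lines($raw);
}

// Pure helper: strip the HTTP headers (everything up to the first blank
// line) and split the body into an array of lines.
function response_body_lines($raw)
{
    $parts = explode("\r\n\r\n", $raw, 2);
    $body  = isset($parts[1]) ? $parts[1] : '';
    return explode("\n", $body);
}
```

With the response in an array, each element can be run through preg_match() exactly as before, and a site that fails to connect or stalls mid-read just returns false (or a partial body) instead of hanging the whole script.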
I also considered set_time_limit(), but reading the manual left me with the impression that the entire script would be terminated, not just the part dealing with the offending site.
Any ideas on how to get the script to skip non-existent pages would be appreciated. Has anyone written any example meta-search scripts?
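One approach to skipping non-existent pages is to read only the first line of the response and check the HTTP status code before parsing anything: a missing page normally comes back as 404, so anything other than 200 can be skipped immediately. A sketch under that assumption; `parse_http_status` is a hypothetical helper name, not a built-in.

```php
<?php
// Extract the numeric status code from an HTTP status line,
// e.g. "HTTP/1.1 404 Not Found" yields 404. Returns 0 if the
// line does not look like a status line at all.
function parse_http_status($statusLine)
{
    if (preg_match('#^HTTP/\d+\.\d+\s+(\d{3})#', $statusLine, $m)) {
        return (int) $m[1];
    }
    return 0;
}

// Usage inside the per-site loop, where $fp is a connected socket
// as returned by fsockopen() after the request has been sent:
//
//   if (parse_http_status(fgets($fp, 128)) !== 200) {
//       fclose($fp);
//       continue; // page missing or server error -- skip this site
//   }
```

Checking the status line first means a site that answers quickly with 404 costs almost nothing, and the regex-search code only ever runs over pages that actually exist.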