Greetings:

I have a really nasty problem that I'm stuck on, and I'm hoping one of you PHP gurus can help me out.

I am writing a PHP script that fetches a web page from a URL the user enters. The script will run on a hosted website, so I cannot modify php.ini.

The script calls file_get_contents() to fetch the page. I have code to detect a number of error conditions - an invalid URL (bad input parameter), no content returned, etc.
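Stripped down, the fetch boils down to something like this (variable names and error handling are simplified here, not the exact code):

<?php
// The URL comes from user input in the real script
$url = isset($_GET['url']) ? $_GET['url'] : '';

// Reject anything that isn't a well-formed URL
if (filter_var($url, FILTER_VALIDATE_URL) === false) {
    die('Invalid URL');
}

// Fetch the page; @ suppresses the warning so the failure is handled below
$html = @file_get_contents($url);

// file_get_contents() returns false on failure; an empty body is treated as an error too
if ($html === false || $html === '') {
    die('No content returned');
}

echo $html;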

The nagging problem is that some websites take a really long time to respond. In fact, the script runs for 30 seconds and then dies with a fatal error indicating "maximum execution time exceeded".

I've tried reducing the timeout with set_time_limit(), but that does not appear to affect how long file_get_contents() waits for the HTTP response.
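For example, right before the fetch I tried something like this (the 10-second figure is just what I used for testing):

<?php
// Placeholder URL for illustration; the real script uses the user-supplied one
$url = 'http://example.com/some-slow-page';

// Try to cap the script at 10 seconds; this does not appear to shorten
// how long file_get_contents() blocks waiting for the remote server
set_time_limit(10);

$html = @file_get_contents($url);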

I can't reduce the wait time (to less than the max execution time) because calling ini_set() on max_input_time is not allowed.
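The attempt was basically this one call (directive name from memory), and it just refuses the change:

<?php
// Try to lower the limit at runtime; on this host ini_set() simply returns
// false, so apparently the directive can't be changed from a script
if (ini_set('max_input_time', '10') === false) {
    echo 'ini_set() refused to change max_input_time';
}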

Any ideas or solutions would be greatly appreciated.

Peace,
robertoD
