I have a script that reads in URLs from a database table. I use HTTP_Request (from PEAR) to get a response code for each URL. One or two URLs are timing out on me and returning a non-fatal PHP error.

I have this: ini_set('max_execution_time', 200); and I do not want to raise it any further.

I'd like to add some timing feature to my $req->sendRequest() call that would let it run for, say, 10-20 seconds; if no response comes back, the code should drop that request and continue with the next URL in the list.

Any thoughts?

Here is my code, thus far.

while ($urlData = $results->fetchRow(DB_FETCHMODE_ASSOC)) {
    $url = $urlData['HL_URL'];
    if ($url) {
        $req =& new HTTP_Request($url);

        // this next line is where I need a timing element
        $req->sendRequest();

        $code = $req->getResponseCode();
        switch ($code) {
            case 200:
                echo "Everything's ok\n";
                break;
            case 301:
                echo "Permanent Redirect\n";
                break;
            case 404:
                echo "Document not found\n";
                break;
        } // end switch
    } // end if
} // end while

Thanks for any ideas
jdc44--

    $req =& new HTTP_Request($url, array('timeout' => 10));

    The timeout is measured in seconds.
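
    In your loop it would slot in where you build the request, something like this (just a sketch; the 10 is only an example value, and it can be a float):

    $req =& new HTTP_Request($url, array('timeout' => 10));

    // with the option set, this call should give up on the connection after roughly 10 seconds
    $req->sendRequest();
    $code = $req->getResponseCode();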

      Thanks for the help, but I am still timing out. Now it is hitting my maximum execution time limit of 200 seconds.

      Can you point me to any documentation that mentions the second argument to the sendRequest() function? I copied it exactly as you posted it. Is there any other code I need to add to my script to make this work?

      Thanks again for your help.

        sendRequest() doesn't appear to take a second parameter, but I assume you meant the class constructor.

        The API documentation is in phpDoc format within the actual file. The latest version of this doc isn't available on the PEAR website, but the prior version is. Honestly, it's not well formatted, and it's easier to read in the actual source file.

          Here's the list from the source file for version 1.2.4.

          
          /**
          * Constructor
          *
          * Sets up the object
          * @param    string  The url to fetch/access
          * @param    array   Associative array of parameters which can have the following keys:
          * <ul>
          *   <li>method         - Method to use, GET, POST etc (string)</li>
          *   <li>http           - HTTP Version to use, 1.0 or 1.1 (string)</li>
          *   <li>user           - Basic Auth username (string)</li>
          *   <li>pass           - Basic Auth password (string)</li>
          *   <li>proxy_host     - Proxy server host (string)</li>
          *   <li>proxy_port     - Proxy server port (integer)</li>
          *   <li>proxy_user     - Proxy auth username (string)</li>
          *   <li>proxy_pass     - Proxy auth password (string)</li>
          *   <li>timeout        - Connection timeout in seconds (float)</li>
          *   <li>allowRedirects - Whether to follow redirects or not (bool)</li>
          *   <li>maxRedirects   - Max number of redirects to follow (integer)</li>
          *   <li>useBrackets    - Whether to append [] to array variable names (bool)</li>
          *   <li>saveBody       - Whether to save response body in response object property (bool)</li>
          *   <li>readTimeout    - Timeout for reading / writing data over the socket (array (seconds, microseconds))</li>
          *   <li>socketOptions  - Options to pass to Net_Socket object (array)</li>
          * </ul>
          * @access public
          */
          function HTTP_Request($url = '', $params = array())
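
          Based on that list, you can set several of the options at once, something like this (a sketch; the values here are just examples):

          $params = array(
              'timeout'        => 10,            // connection timeout in seconds (can be a float)
              'readTimeout'    => array(15, 0),  // socket read/write timeout: seconds, microseconds
              'allowRedirects' => true,          // follow redirect responses
              'maxRedirects'   => 3
          );
          $req =& new HTTP_Request($url, $params);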
          

            Am I correct in thinking that the line:
            $req = new HTTP_Request($url, array('timeout'=>10));
            is supposed to set a time limit of 10 seconds when $req->sendRequest() is called?

            Is there a test I can use to say, "okay, if this request takes more than xx seconds, get the next $url to test"?

            I did add a few print statements to see what was happening and where. This led me to use trim($url), of all things.

            $url = trim($urlData['HL_URL']);

            $req = new HTTP_Request($url, array('timeout'=>10));

            Now I get quite a few more printouts before it errors out. I'm getting through 31 of the 4,800 URLs I need to run through this loop. Sigh!!

            My Error:
            Non Fatal PHP Error: (2) fsockopen(): unable to connect to 165.2.141.65:80 at D:\php\pear\Net\Socket.php line 106

            It's bombing on this url: http://www.bristolhotels.com/

            Thanks again for your tips. I'm learning!!

              Am I correct in thinking that the line:
              $req = new HTTP_Request($url, array('timeout'=>10));
              is supposed to set a time limit of 10 seconds when $req->sendRequest() is called?

              That is correct. The only gotcha there is that the timeout parameter only applies to connection time. You can also set a read timeout, like this:

              $params['timeout'] = 2;
              $params['readTimeout'] = array(8, 0);
              $req = new HTTP_Request($url, $params);
              

              I separated out the parameters to make it more readable. That sets a connection timeout of 2 seconds and a read timeout of 8 seconds.

              Is there a test I can use to say, "okay, if this request takes more than xx seconds, get the next $url to test"?

              You mean something generic that you could apply to any function? Nothing that I know of. Sounds useful though. 🙂
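
              The closest you could get is to time the call yourself and skip the rest of the work for that URL when it ran too long. Something along these lines (just a sketch; microtime(true) needs PHP 5, and this only measures after the fact, since it's the timeout/readTimeout parameters that actually cut the request off):

              $start = microtime(true);
              $result = @$req->sendRequest();
              $elapsed = microtime(true) - $start;

              if ($elapsed > 20) {
                  echo "<p>Skipping $url, the request took more than 20 seconds</p>\n";
                  continue; // move on to the next URL in the while loop
              }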

              My Error:
              Non Fatal PHP Error: (2) fsockopen(): unable to connect to 165.2.141.65:80 at D:\php\pear\Net\Socket.php line 106

              It's bombing on this url: http://www.bristolhotels.com/

              Not surprised it can't get bristolhotels.com, since my web browser also times out on it. Anyway, it does say this is a non-fatal error; your script shouldn't have stopped executing just for this. Perhaps it coincided with the script timeout?

              I noticed earlier you said you had set script execution to 200 seconds. With 4,800 URLs to hit, that works out to roughly 0.04 seconds per URL (200 / 4,800). I think you may need to try fewer URLs or give the script more time to do them.

              By the way, instead of doing ini_set you might want to look at the set_time_limit function.
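
              One handy property of set_time_limit() is that each call restarts the timeout counter, so you could call it inside the loop to give every URL its own budget instead of one 200 second budget for the whole run. A sketch:

              while ($urlData = $results->fetchRow(DB_FETCHMODE_ASSOC)) {
                  set_time_limit(30); // each iteration gets a fresh 30 second budget
                  // ... build the request and check the response code as before ...
              }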

                Wahoo! I got this working.

                I got into the error handling methods for the PEAR HTTP_Request stuff. I simply added the following to my code:

                In part...

                $req =& new HTTP_Request($url, $params);

                // added this
                $result = @$req->sendRequest(); // the @ bypasses our error handlers; returns a PEAR_Error on failure
                if (PEAR::isError($result)) {
                    echo "<p>Cannot connect to $url</p>";
                } else {
                    $code = $req->getResponseCode();
                    // switch statement here to process the codes...
                }
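
                For anyone reading this later, the relevant part of my loop now looks roughly like this (a trimmed-down sketch, not the whole script; the timeout values are the ones suggested above):

                while ($urlData = $results->fetchRow(DB_FETCHMODE_ASSOC)) {
                    $url = trim($urlData['HL_URL']);
                    if (!$url) {
                        continue;
                    }

                    $params = array('timeout' => 2, 'readTimeout' => array(8, 0));
                    $req =& new HTTP_Request($url, $params);

                    $result = @$req->sendRequest();
                    if (PEAR::isError($result)) {
                        echo "<p>Cannot connect to $url</p>";
                        continue;
                    }

                    switch ($req->getResponseCode()) {
                        case 200:
                            echo "Everything's ok\n";
                            break;
                        case 301:
                            echo "Permanent Redirect\n";
                            break;
                        case 404:
                            echo "Document not found\n";
                            break;
                    }
                }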

                Thank you for all your help. It got me pointed in the right direction.
