I am using the nusoap-0.9.5 library to access data through a web service. I need to grab a large amount of data (around 5 MB). The problem is that after about 4 minutes the connection through the web service returns nothing: the page is blank and the browser status bar shows "done". No fault is displayed either. Below is my code.

require_once('lib/nusoap.php');
$ini = ini_set("soap.wsdl_cache_enabled", "0");

try
{
    $wsdl = "http://www.<mywebserviceurl>?wsdl";

    $client = new soapclient($wsdl, 'wsdl');
    $param  = array('Konto' => '********', 'UserId' => '<myuserid>', 'Password' => '<password>');
    $result = $client->call('ProduktListe', $param);

    if ($client->fault)
    {
        echo '<h2>Fault</h2><pre>'; print_r($result); echo '</pre>';
    }
    else
    {
        $err = $client->getError();
        if ($err)
        {
            echo '<h2>Error</h2><pre>' . $err . '</pre>';
        }
        else
        {
            echo '<h2>Result</h2><pre>'; print_r($result); echo '</pre>';
        }
    }

    echo '<h2>Request</h2><pre>' . htmlspecialchars($client->request, ENT_QUOTES) . '</pre>';
    echo '<h2>Response</h2><pre>' . htmlspecialchars($client->response, ENT_QUOTES) . '</pre>';
    echo '<h2>Debug</h2><pre>' . htmlspecialchars($client->getDebug(), ENT_QUOTES) . '</pre>';
}
catch (Exception $err)
{
    echo $err;
}

Please help.

    Are you sure your WSDL is properly accessing the gateway, and that the gateway is actually attempting to cough up the data? The timeout could be the download itself timing out, or it could be that your server's connection to the remote URL times out due to lack of permissions or something similar.

    Is your script outputting anything at all?

      The default timeout for a PHP script not run on the command line is 30 seconds.
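
      You can raise (or remove) that limit for just this one script rather than globally -- for example:

      set_time_limit(0);                    // no limit for this request
      // or
      ini_set('max_execution_time', '600'); // e.g. 10 minutes

      Both of those only affect the current request.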

        Yes, I can access other SOAP methods from this service and get data. For instance, if I use:

        $result = $client->call('ProduktInfo', $param);

        where

        $param=array('Varenummer'=>'1111aaa22', 'Konto'=>'********', 'UserId'=>'<myuserid>', 'Password'=>'<password>');

        it successfully gets a single record for the product id 1111aaa22.

        Secondly, I was previously getting a timeout error; I set max_execution_time = 360 and memory_limit = 256M in php.ini and that went away. Then I got "HTTP Error: socket read of headers timed out", which I fixed by setting $timeout and $response_timeout to 360 throughout nusoap.php, and that went away too. Do I still have to increase the timeout to something like 10 minutes?
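
        For reference, here is roughly what I have now -- I am assuming that setting the client's $timeout / $response_timeout properties from my script is equivalent to editing those defaults inside nusoap.php:

        ini_set('max_execution_time', '360');
        ini_set('memory_limit', '256M');

        $client = new soapclient($wsdl, 'wsdl');
        $client->timeout = 360;           // connection timeout, in seconds
        $client->response_timeout = 360;  // how long to wait for the response body

        $result = $client->call('ProduktListe', $param);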

          sneakyimp wrote:

          Is your script outputting anything at all?

          Nothing is output, and the browser's status bar shows "done" after 3 or 4 minutes. "View source" shows bare HTML like <html><head>...</html> with no data in between. One more thing: the "Networking" tab in the Task Manager shows data-retrieval activity, and about 3.5+ MB is fetched during that period.

            Wouldn't surprise me if it's the remote server choking; it could be quite a load if a lot of people are pulling that much data. I would never allow that much data, or a connection held open that long, to any of our web services. Who is the connection to?

              It's a product database from an office-equipment business that publishes this web service for its affiliates. In fact this web method is supposed to run infrequently, maybe once a week, to update the local database.

              BTW, when I call this service directly through the browser it does fetch the data: it starts showing the result XML within a few seconds and keeps receiving and updating for 4 to 5 minutes.

                10 minutes to download 4 or 5 MB is really slow. Sounds like the provider needs to switch to something like Amazon EC2 for their hosting!

                You may have some luck in eliminating timeouts (or increasing them substantially). If the setup works sometimes and not others, I think Dagon is right that it's probably the remote server. Another possibility is that your server has limited bandwidth -- are you running on a shared hosting setup, or do you have a dedicated machine?

                I'm also wondering if you run this in response to user input or if it's an automated task.
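
                If it is automated, I would take the browser out of the picture: run the fetch from the command line (where max_execution_time defaults to 0, i.e. no limit) and write the response to a file for your weekly update to read. A rough sketch, reusing your $client and $param from above (the file names are just placeholders):

                // run as: php fetch_produktliste.php
                $result = $client->call('ProduktListe', $param);
                if (!$client->fault && !$client->getError()) {
                    file_put_contents('produktliste.xml', $client->response); // raw SOAP response
                } else {
                    error_log('ProduktListe failed: ' . $client->getError());
                }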

                  Above I said this:

                  The "Networking" tab in the Task Manager shows data-retrieval activity, and about 3.5+ MB is fetched during that period.

                  That 3.5+ MB is fetched within just 1.5 minutes; after that no further data retrieval is shown, although the browser indicates processing is underway until the <timeout> setting is reached, and then the page goes blank.

                  For now I am testing the connection to the web service from my localhost; I have installed WAMP server.

                    dagon sort of hinted at the idea that exchanging large amounts of data like this via a web service may not be the best approach. I'm not sure why the gateway would cough up 80% of the file in the first 1.5 minutes and then fail to deliver the remaining 20% in 3 more minutes, but it sounds like a server-side issue to me -- meaning the remote service.

                    If you request a 3.5 MB file from localhost it arrives almost instantly, so the bottleneck is likely that the remote server is not coughing up the data quickly enough. It's been my experience that when you serve large files from a web server, you should transmit a Content-Length header first -- it helps prevent timeout behavior and makes for more meaningful progress bars and such. I have no idea whether the remote server is doing anything like this, or whether NuSOAP responds to such headers.
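
                    Purely to illustrate the idea (this would be on their end, not yours, and buildProduktListeXml() is a made-up placeholder):

                    $xml = buildProduktListeXml();              // hypothetical: build the full response first
                    header('Content-Type: text/xml; charset=utf-8');
                    header('Content-Length: ' . strlen($xml));  // tell the client how much data is coming
                    echo $xml;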
