I am getting desperate to solve a major issue with a content delivery system we use to deliver content across several different client sites.
The code we place on a client's web server, which grabs the data from our servers and displays it on their site, fails about 1 in 50 tries, which is clearly not acceptable.
CODE
error_reporting(0);

// Fetch a URL over a raw socket and echo the response body.
// Increments the global $ErrorCount on a failed connection or an empty body.
function fetchUrl($url)
{
    global $ErrorCount;
    $numberOfSeconds = 30; // connection timeout

    // Split the URL into host and resource path
    $url = str_replace("http://", "", $url);
    $urlComponents = explode("/", $url);
    $domain = $urlComponents[0];
    $resourcePath = str_replace($domain, "", $url); // e.g. "/file.php"

    $data = '';
    $socketConnection = fsockopen($domain, 80, $errno, $errstr, $numberOfSeconds);
    if (!$socketConnection)
    {
        $ErrorCount++;
    }
    else
    {
        // Send a minimal HTTP/1.0 request and read the whole response
        fputs($socketConnection, "GET $resourcePath HTTP/1.0\r\nHost: $domain\r\n\r\n");
        while (!feof($socketConnection))
        {
            $data .= fgets($socketConnection);
        }
        fclose($socketConnection);
    }

    // Strip the HTTP headers; everything after the first blank line is the body
    $data = substr(strstr($data, "\r\n\r\n"), 4);
    if (!$data)
    {
        $ErrorCount++;
    }
    else
    {
        echo $data;
    }
    return $data;
} // end function

$ErrorCount = 0;
$filename = "http://www.somewhere.com/file.php";

// Retry up to four times before giving up
fetchUrl($filename);
if ($ErrorCount == 1) { fetchUrl($filename); }
if ($ErrorCount == 2) { fetchUrl($filename); }
if ($ErrorCount == 3) { fetchUrl($filename); }
if ($ErrorCount == 4) { echo "Can't Connect --- Please REFRESH"; }
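For clarity, the repeated if checks at the end are just meant to retry up to four times. Written as a loop, the same retry idea would look roughly like this, assuming fetchUrl keeps bumping $ErrorCount on every failed attempt as it does above:
CODE
$ErrorCount = 0;
$maxAttempts = 4;
$succeeded = false;

// Keep calling fetchUrl until it stops recording new errors or we run out of attempts
for ($attempt = 0; $attempt < $maxAttempts; $attempt++) {
    $previousErrors = $ErrorCount;
    fetchUrl($filename);
    if ($ErrorCount == $previousErrors) {
        $succeeded = true; // no new error recorded, so the fetch worked
        break;
    }
}
if (!$succeeded) {
    echo "Can't Connect --- Please REFRESH";
}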
The problem seems to be in making the connection from their servers back to our servers, but I was wondering if anybody knew of a better or different way to do this.
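In case it helps the discussion, here is roughly what I imagine a cURL-based replacement for fetchUrl might look like. This is only a sketch: it assumes the cURL extension is enabled on the client servers, and fetchUrlCurl, the 30-second timeouts, and www.somewhere.com are just placeholders.
CODE
// Sketch of an alternative fetch using cURL instead of fsockopen.
// Returns the response body on success, or an empty string on failure.
function fetchUrlCurl($url)
{
    $handle = curl_init($url);
    curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
    curl_setopt($handle, CURLOPT_CONNECTTIMEOUT, 30);     // connection timeout in seconds
    curl_setopt($handle, CURLOPT_TIMEOUT, 30);            // overall transfer timeout
    curl_setopt($handle, CURLOPT_FOLLOWLOCATION, true);   // follow redirects if any

    $body = curl_exec($handle);
    $status = curl_getinfo($handle, CURLINFO_HTTP_CODE);
    curl_close($handle);

    if ($body === false || $status != 200) {
        return '';
    }
    return $body;
}

$body = fetchUrlCurl("http://www.somewhere.com/file.php");
if ($body != '') {
    echo $body;
} else {
    echo "Can't Connect --- Please REFRESH";
}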
I will also add that our servers are behind both a FIREWALL and a device that balances network traffic over two lines (T1 and CABLE MODEM).
Any help, advice or direction will be greatly appreciated.
Thanks