I have about 15 scripts that access the same XML URL, each grabbing different objects from it, parsing them, and inserting them into the database. Each one creates the SimpleXMLElement like this sample:
$sxe = new SimpleXMLElement('http://example.org/document.xml', NULL, TRUE); // third argument TRUE means the first argument is a URL, not XML data
Individually, they all work fine.
I also have a master.php, which just includes all 15 scripts:
include "script1.php";
include "script2.php";
...
include "script15.php";
When I call master.php, it runs these 15 scripts in turn.
What I've experienced is that sometimes, partway through, say script7.php will fail at the point of constructing the SimpleXMLElement, giving me an error like an XML parse error.
But that shouldn't be possible: all these scripts use the same XML URL to create the SimpleXMLElement, so if script1.php through script6.php are fine, why does script7.php fail?
At first, I thought it might be a memory issue. But a memory issue shouldn't produce a parse error, right? Besides, I increased the memory limit and checked memory_get_usage(); both seem fine.
The problem only shows up when I am on a slow network.
Since the same script (master.php) makes HTTP requests to the same URL (up to 15 times) in a very short time, could the server be returning only part of the content when the traffic is too busy?
If that is the case, should I use the URL to create the SimpleXMLElement? Or use file_get_contents() to get the XML string first and then create the SimpleXMLElement from that string?
I don't see why the two approaches would give different results; just wondering.
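For reference, this is roughly the two-step version I have in mind; the libxml error handling is just a sketch I'd add for debugging, not something already in my scripts:

libxml_use_internal_errors(true); // collect parser errors instead of emitting warnings

$xml = file_get_contents('http://example.org/document.xml');
if ($xml === false) {
    die('HTTP request failed');
}

$sxe = simplexml_load_string($xml);
if ($sxe === false) {
    // The download "succeeded" but the body is not valid XML,
    // e.g. a truncated response; print the parser errors to see why.
    foreach (libxml_get_errors() as $error) {
        echo trim($error->message), "\n";
    }
    die('XML parse error');
}

At least this way I could tell a failed HTTP request apart from a genuinely malformed response.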
Any advice?
I can modify my master.php to work around this. But my concern is: when the traffic is too busy on a slow network, is using the URL to generate the SimpleXMLElement a good idea? Or would using file_get_contents() to get the XML string first make any difference?
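The workaround I have in mind for master.php is to fetch and parse the document once, then let all 15 scripts share the result instead of each constructing its own SimpleXMLElement. A hypothetical sketch (it assumes I refactor the scripts to use a shared $sxe variable):

// Fetch once in master.php and parse from the string, not the URL.
$xml = file_get_contents('http://example.org/document.xml');
if ($xml === false) {
    die('could not fetch the XML document');
}
$sxe = new SimpleXMLElement($xml); // first argument is XML data here

// Included scripts share master.php's variable scope, so each one
// could read $sxe instead of hitting the URL again.
include "script1.php";
include "script2.php";
// ...
include "script15.php";

That would cut the requests from 15 down to 1, but I'd still like to understand why the per-script requests fail in the first place.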
Thanks!