I have a script that sends and receives XML. It works great when fewer than 200 results are returned, but it unexpectedly quits somewhere past 200 results. 550 results is only about 2.5 MB, so I can't seem to figure out why it's failing. The memory_get_usage() function is not available on my web host, so it is terribly difficult to debug. I get the following error in my error logs:
FATAL: emalloc(): Unable to allocate 20279 bytes
I understand that this is a memory problem, but I can't just raise the memory limit on a shared server (and I assume there must be a way to use less memory anyway).
My script is quite long, but I will paste some of it below.
Three things which might help:
1. A way to see how much memory the script is using as it is processing (that doesn't use memory_get_usage)
2. A simple fix
3. A miracle
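For #1, here is a rough sketch of the kind of workaround I have in mind, assuming a Linux host where getrusage() and /proc/self/status happen to be available (which may well not be the case on a locked-down shared server):

// Rough sketch: log memory usage without memory_get_usage().
// Assumes Linux; getrusage() and /proc/self/status may not exist on every host.
function debug_memory($label)
{
    // getrusage() reports the peak resident set size in kilobytes on Linux
    $peak = 'n/a';
    if (function_exists('getrusage')) {
        $usage = getrusage();
        if (isset($usage['ru_maxrss'])) {
            $peak = $usage['ru_maxrss'] . ' kB';
        }
    }

    // /proc/self/status exposes the current resident set size (VmRSS)
    $current = 'n/a';
    if (is_readable('/proc/self/status')) {
        $status = file_get_contents('/proc/self/status');
        if (preg_match('/VmRSS:\s+(\d+)/', $status, $m)) {
            $current = $m[1] . ' kB';
        }
    }

    error_log("$label - peak: $peak, current: $current");
}

// e.g. call debug_memory("after curl") once, then debug_memory("row $i") inside the loop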
Send & receive the XML
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $query_url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-Type: text/xml"));
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $query);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the whole response as a string
$results = curl_exec($ch);
curl_close($ch);
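Just as a point of comparison (this is a sketch, not what my script currently does): cURL can also write the response straight to a temporary file with CURLOPT_FILE instead of buffering the whole body in a string, which would at least keep the raw XML out of PHP's memory. Of course that only helps if the parsing step below can read from the file incrementally as well.

// Sketch: stream the response to a temp file instead of holding it in a string
$fp = tmpfile();
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $query_url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-Type: text/xml"));
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $query);
curl_setopt($ch, CURLOPT_FILE, $fp); // write the body to $fp instead of returning it
curl_exec($ch);
curl_close($ch);
rewind($fp); // $fp now holds the full XML response on disk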
Process the XML
$dom = domxml_open_mem($results);
unset($results);
$root = $dom->document_element();
$listing = $root->child_nodes();
// loop through returned rows
for ($i = 0; $i < count($listing); $i++)
{
    $elements = current($listing);
    if ($elements->node_name() == "listing")
    {
        // collect each child node's name and text content for this listing
        $elements = $elements->child_nodes();
        for ($j = 0; $j < count($elements); $j++)
        {
            $node = current($elements);
            $node_content = $node->get_content();
            $node_name = $node->node_name();
            $data[$node_name] = $node_content;
            next($elements);
        }
        // Process the results and insert into a DB
        // There is much, much more code here, that I can post if needed
    }
    next($listing); // advance to the next row
}
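And for completeness, a rough sketch of the kind of event-based (SAX-style) parse that would avoid building the whole DOM in memory, using PHP's xml_parser functions. The handler names are made up for illustration, $fp is the temp-file handle from the cURL sketch above, and the DB insert would have to move inside end_element(); I have not actually converted my processing code to this.

// Sketch: event-based parse that handles one <listing> at a time,
// so $data never has to hold more than one row.
$data = array();
$current_tag = "";

function start_element($parser, $name, $attrs)
{
    global $current_tag;
    $current_tag = strtolower($name); // the parser upper-cases names by default
}

function end_element($parser, $name)
{
    global $data, $current_tag;
    if (strtolower($name) == "listing") {
        // insert $data into the DB here, then throw it away
        $data = array();
    }
    $current_tag = "";
}

function character_data($parser, $text)
{
    global $data, $current_tag;
    if (trim($text) == "" || $current_tag == "" || $current_tag == "listing") {
        return; // skip whitespace and text that isn't inside a field element
    }
    // append in case the parser delivers the text in several pieces
    $data[$current_tag] = isset($data[$current_tag]) ? $data[$current_tag] . $text : $text;
}

$parser = xml_parser_create();
xml_set_element_handler($parser, "start_element", "end_element");
xml_set_character_data_handler($parser, "character_data");

while (!feof($fp)) {
    xml_parse($parser, fread($fp, 8192), feof($fp)); // feed the file 8 kB at a time
}
xml_parser_free($parser);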