I am looping over a large number of XML files, parsing them, and adding their data to an Oracle database. After the script has been running for a while it always, without fail, terminates with a WARNING and then a FATAL memory allocation error. The WARNINGs occur on the simplexml_load_string() line:

Warning: simplexml_load_string() [function.simplexml-load-string]: Memory allocation failed : growing buffer in c:\Inetpub\wwwroot\importer0.1\ECCImport.php on line 304

line 304: $sx = simplexml_load_string($buf);

and the FATAL:

FATAL: emalloc(): Unable to allocate 15116551 bytes

To try to combat this, at the end of each loop iteration I explicitly unset($sx) and unset($buf). This shouldn't be necessary, since both variables are reassigned at the top of each iteration, but it actually did help a little.
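For reference, a stripped-down sketch of the loop (the file list and the import routine name are simplified placeholders; the real import code is much longer):

foreach ($xmlFiles as $file) {
    $buf = file_get_contents($file);    // reassigned at the top of each iteration
    $sx  = simplexml_load_string($buf); // this is line 304 in the real script

    if ($sx === false) {
        continue; // skip files that fail to parse
    }

    import_into_oracle($sx); // placeholder for my Oracle insert logic

    // explicit cleanup at the end of the iteration
    unset($sx);
    unset($buf);
}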

I realize that I can increase memory_limit in php.ini, but I fear this will just postpone the issue without actually solving it. I believe each iteration of my loop should start with a clean memory slate (and thus consume a predictably small amount of memory).
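If I do end up raising the limit, I'd probably do it per-script with ini_set() rather than globally, and log memory use each pass to see whether it actually grows. A minimal diagnostic sketch (I believe memory_get_usage() needs PHP built with --enable-memory-limit; these lines aren't in the script yet):

ini_set('memory_limit', '128M'); // per-script override instead of editing php.ini

// at the bottom of each loop iteration:
echo memory_get_usage() . " bytes in use\n";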

Now I am totally stumped. Is there a memory leak in the function, or is it more likely elsewhere in my code? Has anyone come across this behavior in their own code with the new XML handling functions?

Thanks,
Mark.

    I'm not at all sure (those errors look pretty unfamiliar), but it seems to me that your XML might be too large. Isn't that about 15 megabytes? I think the default limit is 8 megabytes. How large are the XML files you're opening?

    A way to get around this, if you have root access to the machine via SSH, would be to run the script from the command line:

    php -q myscript.php

    I think that will ignore any memory constraints, but I'm not sure.
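    If it doesn't, you could also force the limit up for just that one run with the -d switch (I haven't tested this against your setup, but the switch itself is standard):

    php -d memory_limit=256M -q myscript.php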

      sneakyimp wrote:

      I'm not at all sure (those errors look pretty unfamiliar), but it seems to me that your XML might be too large. Isn't that about 15 megabytes? I think the default limit is 8 megabytes. How large are the XML files you're opening?

      Good question! The largest can be around 20 megs. The funny thing is, it gets through many of them before finally collapsing.

      Thanks for your input. I am going to relax the memory constraints in php.ini just to see if it gets farther.
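      For the record, this is the directive I plan to bump (the value is a guess, just something comfortably above my largest file):

      memory_limit = 64M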

        Well, if you are aggregating them all into a single data structure and assigning that to some variable, you are probably out of luck, because you're going to chew up all kinds of RAM on your server. The 8 meg limit may only apply to servers in safe mode:
        http://us2.php.net/manual/en/features.safe-mode.php

        But aggregating multiple 10 MB+ files into memory is not gonna fly if you have several users at once, regardless of whether you're in safe mode or not. Memory hog!

        If you just need to do a one-time import into your database, why not do one file at a time? If you are trying to combine all the data into some calculated/generated aggregate that has to compare/reduce/compile the data before storage, try running one file at a time and move the calculation/aggregation into SQL operations rather than trying to get all the data squared away before inserting it (see the sketch below). It sounds to me like you're going to have to work on the data a mouthful at a time rather than trying to swallow it whole.
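        Something along these lines is what I mean. Strictly a sketch: the table name, column names, XML layout, and connection details are all made up, so adjust them to your real schema.

        <?php
        // process one file at a time: parse it, insert its rows, let it fall out of memory
        $conn = oci_connect('user', 'password', 'localhost/XE');
        if (!$conn) {
            die('could not connect to Oracle');
        }

        $stmt = oci_parse($conn,
            'INSERT INTO import_items (item_id, item_value) VALUES (:id, :val)');

        foreach (glob('/path/to/xml/*.xml') as $file) {
            $sx = simplexml_load_string(file_get_contents($file));
            if ($sx === false) {
                continue; // skip unparseable files
            }

            // hypothetical structure: <item id="..."><value>...</value></item>
            foreach ($sx->item as $item) {
                $id  = (string) $item['id'];
                $val = (string) $item->value;
                oci_bind_by_name($stmt, ':id',  $id);
                oci_bind_by_name($stmt, ':val', $val);
                oci_execute($stmt);
            }

            unset($sx); // make sure nothing carries over to the next file
        }

        oci_close($conn);
        ?>

        The point is that each file's data goes into the database and then out of PHP's memory, and any totals or aggregates come from GROUP BY queries afterwards instead of one giant PHP array.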

        If you are expecting to do this kind of thing every time a user visits the website, you are probably going to have real problems. I would advise against that.
