I have a PHP script that reads a file (using file() or gzfile()) into an array. Each array element contains a pipe-delimited string. I then explode each element and place the pieces into a multidimensional array for reading...
// read the whole gzipped file into an array, one line per element
$file_array = gzfile("bigFile.txt.gz");
for ($i = 0; $i < count($file_array); $i++) {
    // split each pipe-delimited line into its fields
    $product_info[$i] = explode('|', $file_array[$i]);
    print("$i<br/>");
}
This has been tested and works fine for files with 100,000+ array elements and a file size of 7 MB. When I attempted to read a larger gz file of 170 MB with 1.1 million array elements, the script got to approximately $i = 140,000 to 178,000 and then silently ended. I inspected the error logs and found nothing.
My server and PHP have a timeout of 570 years 🙂 so I know it shouldn't be a timeout issue... is there a limit to the array size? The first array of strings is created fine; it's when I attempt to explode the strings into a multidimensional array that I run into problems.
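One diagnostic I can think of is printing memory consumption as the loop runs. This is a rough sketch, and it assumes my PHP 4 build was compiled with --enable-memory-limit (otherwise memory_get_usage() isn't available):

// hypothetical diagnostic: watch how memory grows as rows are exploded
// memory_get_usage() requires PHP 4 compiled with --enable-memory-limit
print("memory_limit = " . ini_get('memory_limit') . "<br/>");
$file_array = gzfile("bigFile.txt.gz");
for ($i = 0; $i < count($file_array); $i++) {
    $product_info[$i] = explode('|', $file_array[$i]);
    if ($i % 10000 == 0) {
        print("$i: " . memory_get_usage() . " bytes<br/>");
    }
}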
I tried skipping ahead by starting the explode loop at the $i = 200,000 index but got the same result... only 130,000 to 170,000 elements exploded into the multidimensional array before the script quit.
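That test looked roughly like this (a sketch from memory, not the exact code I ran):

$file_array = gzfile("bigFile.txt.gz");
// start partway into the array instead of at element 0
for ($i = 200000; $i < count($file_array); $i++) {
    $product_info[$i] = explode('|', $file_array[$i]);
    print("$i<br/>");
}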
If I limit the explode loop to 100,000, it works fine.
?????
I'm at a total loss!
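One thing I haven't tried yet is reading the file line by line with gzopen()/gzgets() instead of slurping the whole thing into memory with gzfile(), so only one row is held at a time. A rough, untested sketch (it assumes no line is longer than the 4096-byte read buffer):

$fp = gzopen("bigFile.txt.gz", "r");
if ($fp) {
    while (!gzeof($fp)) {
        // read one pipe-delimited row and split it immediately
        $line = gzgets($fp, 4096);
        $row = explode('|', rtrim($line));
        // ...process $row here instead of accumulating every row...
    }
    gzclose($fp);
}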
---- Stuff ---------------------
PHP Version 4.3.3 www.entropy.ch Release 1
max_execution_time 30000000
max_input_time 60000
open_basedir no value
output_buffering 4096
output_handler no value
post_max_size 8M
upload_max_filesize 2M
HTTP_KEEP_ALIVE 300
Apache:
Apache Version Apache/1.3.26
Apache Release 10326100
Apache API Version 19990320
Max Requests Per Child: 100000 - Keep Alive: on - Max Per Connection: 100
Timeouts Connection: 3000000 - Keep-Alive: 15
Thanks for any help
-John