I have a problem using the stream filter functions on large files. When reading through, say, a 1.5 GB text file, memory is quickly exhausted. Consider the following script:
$fp_src = fopen('file', 'r');
// convert from ISO-8859-1 to UTF-8 as the stream is read
$filter = stream_filter_prepend($fp_src, 'convert.iconv.ISO-8859-1/UTF-8');
$count = 0;
while (($chunk = fread($fp_src, 4096)) !== false && $chunk !== '') {
    ++$count;
    if ($count % 1000 == 0) print ftell($fp_src) . "\n"; // periodic progress report
}
fclose($fp_src);
As it progresses, it eats up more and more memory: I was seeing over 200 MB in use after processing just 35 MB of the file. Rerunning the script without the stream filter blasts through the file with a constant memory footprint of about 10 MB.
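For what it's worth, the growth can also be watched from inside the script by printing memory_get_usage() next to ftell(); a minimal variation of the loop above:

$fp_src = fopen('file', 'r');
stream_filter_prepend($fp_src, 'convert.iconv.ISO-8859-1/UTF-8');
$count = 0;
while (($chunk = fread($fp_src, 4096)) !== false && $chunk !== '') {
    if (++$count % 1000 == 0) {
        // bytes read so far vs. memory currently allocated to PHP
        print ftell($fp_src) . "\t" . memory_get_usage(true) . "\n";
    }
}
fclose($fp_src);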
What gives? Is this "expected behavior"? Any way around it?
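One possible workaround would be to skip the filter entirely and convert each chunk with iconv() directly; a rough sketch (the output path file-utf8 is only illustrative):

$fp_src = fopen('file', 'r');
$fp_dst = fopen('file-utf8', 'w'); // illustrative output path
while (($chunk = fread($fp_src, 4096)) !== false && $chunk !== '') {
    // convert each 4 KB chunk manually instead of through a stream filter
    fwrite($fp_dst, iconv('ISO-8859-1', 'UTF-8', $chunk));
}
fclose($fp_src);
fclose($fp_dst);

Since ISO-8859-1 is a single-byte encoding, splitting the input at arbitrary chunk boundaries should be safe for iconv(), but I'd rather keep the stream filter approach if it can be made to behave.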