I'm running a cron job that generates reports online and then saves them out as static HTML pages. In most cases this works just fine. But one client has so much data that it takes about 9 minutes to pull up their page in the browser.
When I request that same URL with PHP and try to get the contents of the page, it keeps timing out on me. Is there something else I can use, or something I can add to extend the processing time? I've already raised max_execution_time and called set_time_limit on that page. I did a Google search for this kind of thing (as well as a search here on the board) and found some references to fsockopen, but I haven't used that before and I'm not sure it's really what I need here.
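For reference, the fetch is basically this (the URL and output path are placeholders, not my real ones):

```php
<?php
// Gist of the cron script that grabs the generated report page.
set_time_limit(0);  // I've set this, plus max_execution_time in php.ini

// This is the request that keeps timing out on the big client:
$html = file_get_contents('http://example.com/client-report.php');
if ($html === false) {
    die('Fetch failed');
}

// Save the rendered report out as a static page.
file_put_contents('/var/www/reports/client-report.html', $html);
```

One thing I'm not sure about: do the execution-time settings even cover the time spent waiting on the remote page, or is there a separate socket/read timeout I need to raise for file_get_contents?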