I've been hashing this one out on the Database board recently, without any luck, but I now have some new info.

I need to run a PHP script (containing an SQL query) in the background, so that I can display a "query processing, please wait..." page. Some of my queries can take up to 3 minutes, which causes some browsers to time out, so I would like to display a holding page that refreshes the browser. I tried to do this using exec() or system(), but no luck.
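For reference, the holding page itself is easy enough; something like this meta-refresh stub is the kind of thing I have in mind (status.php is just a placeholder name):

  <?php
  // A "please wait" stub: the meta tag makes the browser re-request
  // status.php every 10 seconds, so no single request stays open long
  // enough to hit a browser timeout.
  echo '<html><head>';
  echo '<meta http-equiv="refresh" content="10;url=status.php">';
  echo '</head><body>Query processing, please wait...</body></html>';
  ?>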

Well, it turns out that my host, Yahoo! Web Hosting, doesn't allow these functions, or ANY program execution functions for that matter :o. The only allowed function I've found that offers any hope is pfsockopen() (per a note in the php.net comments), but I can't seem to get it to work properly either.

Does anyone have any ideas on how to call a PHP script from another 'parent' script and let it run in the background, without using any of PHP's program execution functions (system, passthru, exec, shell_exec, etc.)? It does not need to return any values. Could pfsockopen() do the trick?
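For what it's worth, here is roughly the socket approach I've been attempting (worker.php stands in for the script with the query; as I understand it, worker.php would also need ignore_user_abort(true) at the top so it keeps running after the socket closes):

  <?php
  // Fire a request at worker.php and hang up without waiting for the
  // response, so the parent page can return to the browser immediately.
  $fp = fsockopen('www.example.com', 80, $errno, $errstr, 5);
  if ($fp) {
      fwrite($fp, "GET /worker.php HTTP/1.0\r\n");
      fwrite($fp, "Host: www.example.com\r\n");
      fwrite($fp, "Connection: Close\r\n\r\n");
      fclose($fp); // don't read the response; just let worker.php run
  }
  ?>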

Any suggestions would be appreciated. Thanks

    I'd like to see your query and data to find out why it is taking so long to process. Fixing that would seem to be the better option, and I'm certain it can be improved; queries generally can, and a 3-minute one certainly can.

    I would also expect that Yahoo has limited the execution time, so a 3-minute query is going to time out regardless of what the browser does anyway.

      But some of them can take up to 3 minutes. They all finish executing, but different browsers time out after different waits; some IE6 installs give up after only 100 seconds.

      There is a good bit of number crunching going on, across multiple tables, each with thousands of rows (one has over 100,000), so I think I've done pretty well in optimizing the queries. Most run in under 10 seconds.

      The question still stands: any idea how to run a PHP script in the background?

        I still doubt that any query needs to take 3 minutes in MySQL; 100,000 rows is nothing.
        We were discussing queries over code in another thread, and someone ran a benchmark: an aggregate query calculating the mean of 40,000 rows took under a second and was 4 times faster than doing it in code. When he upped it to 350,000 rows, the query still took under a second, but PHP took 6 seconds.

        So, I say again: post the 3-minute query here and I know I can rewrite it to cut the execution time massively. Even if you get around the browser timeout, waiting 3 minutes for a page is a lousy user experience. I wouldn't expect anyone to wait that long; they must be a captive audience if they do.
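        To show what I mean by queries over code, here are the two approaches side by side (table and column names invented, and assuming a connection is already open):

          <?php
          // One aggregate query: the database does the work.
          $res  = mysql_query('SELECT AVG(price) AS mean FROM orders');
          $row  = mysql_fetch_assoc($res);
          $mean = $row['mean'];

          // Versus dragging every row into PHP and averaging by hand;
          // this is the sort of version that took 6 seconds at 350,000 rows.
          $res = mysql_query('SELECT price FROM orders');
          $sum = 0;
          $n   = 0;
          while ($row = mysql_fetch_assoc($res)) {
              $sum += $row['price'];
              $n++;
          }
          $mean = $n ? $sum / $n : 0;
          ?>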

          Exactly. I remember having some trouble with a 30-second timeout (the default max_execution_time in php.ini),
          but that was with a script creating a whole database with something like 400,000 INSERT queries!
          A SELECT over 100,000 records should really be quicker 🙂
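          You can check the limit, and raise it where the host permits, from the script itself:

            <?php
            // See what the host allows; the default max_execution_time is 30.
            echo ini_get('max_execution_time');

            // Raise it for this script only. Hosts often disable this,
            // and it has no effect at all when safe mode is on.
            set_time_limit(300);
            ?>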

            Looking at this page, it seems it should take MySQL around 6 minutes to read 2 MILLION rows by an index.

              And the only reason it takes that long is that 2 million records take up a lot of space on the hard disk and cause a lot of long head movements, which are the slowest part of all disk I/O.

                Also look at the date at the bottom of the page:

                "2002-2003 Active-Venture.com Small business hosting" - a bit old, isn't it?

                  For Ajax + PHP, I recently tried a very simple framework for this: Ajax-Agent.

                  Perhaps the workload could be done in steps, giving the client some response after, let's say, a third or half of the work?

                  Creating a loading screen is simple, as is having work going on "behind" the page, but neither will let you get past the maximum timeout. Check this example and let it run for a while; the error may well reproduce itself.

                  Ajax Forum stock chart ticker example
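                  Server-side, the step approach might look roughly like this (step.php and the three-slice split are invented for illustration; the page would poll it with Ajax):

                    <?php
                    // step.php -- do one slice of the work per request, so no
                    // single request runs long enough to hit any timeout.
                    session_start();
                    $step  = isset($_SESSION['step']) ? $_SESSION['step'] : 0;
                    $total = 3; // pretend the workload splits into 3 slices

                    // ... process slice number $step of the query here ...

                    $_SESSION['step'] = ++$step;
                    echo ($step >= $total) ? 'done' : "step $step of $total";
                    ?>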

                    It's possible to do the following:

                    1. With careful respect to locking (I used a lock file), create a PHP child process. Have it read the task from a file or similar. Make very sure that you don't create more than one instance of this process (you don't want a situation where there are 10 worker processes running after some idiot clicks refresh repeatedly on your page).

                    2. Wait for the results to arrive. Make a PHP script which attempts to lock a file held locked by the worker process, with a timeout. After a timeout of, say, 10 seconds, it continues and reloads the page; you could also show progress if you have that data (see the sketch below). The page will reload immediately and then wait for another 10 seconds, and so on.

                    3. When the task is finished, the worker process exits, releasing the file lock, so that the web page can continue.

                    As long as you make absolutely CERTAIN that you can't run more than one instance of this job at once, you're fine. Otherwise you risk creating far too many processes.
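                    Here's a bare-bones sketch of what I mean (file names are placeholders, and flock() with LOCK_NB in a loop stands in for the "lock with a timeout", which PHP doesn't offer directly):

                      <?php
                      // worker.php -- the background child process
                      $fp = fopen('/tmp/report.lock', 'a');
                      if (!flock($fp, LOCK_EX | LOCK_NB)) {
                          exit; // another worker holds the lock: never run two
                      }
                      set_time_limit(0);
                      // ... run the long task, write the results somewhere ...
                      flock($fp, LOCK_UN); // releasing the lock signals "finished"
                      fclose($fp);
                      ?>

                      <?php
                      // wait.php -- the page the browser sits on
                      $fp    = fopen('/tmp/report.lock', 'a');
                      $start = time();
                      while (!flock($fp, LOCK_EX | LOCK_NB)) { // worker still busy
                          if (time() - $start >= 10) {
                              header('Refresh: 0');            // reload, wait again
                              echo 'Query processing, please wait...';
                              exit;
                          }
                          sleep(1);
                      }
                      flock($fp, LOCK_UN);
                      header('Location: results.php');         // done: show results
                      ?>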

                    I saw this once on an old Unix system; our application had a report which took several minutes to run. The operator thought it wasn't working, so tried again. And again, and again. Soon the whole system was cripplingly slow. I logged in remotely and saw the many processes, killed them, and it all came back to life.

                    Mark

                      I do acknowledge that my query may be in need of optimizing. I appreciate the offer of help, but I warn you, it is a beast. I've sent it to you in 4 parts, due to the 2000-character max on each message. Yeah, I know, woohoo.

                      Let me know if you received it, and what you think I can do. Thanks a bunch. 🙂
