Hi,

I'm hosting my application on both Windows XP and Fedora, and a certain part of my web application needs to call an external program through one of PHP's execution functions: exec(), popen(), shell_exec(), system(), etc.

Both systems are served by Apache.

The problem is that when I call OpenOffice (I'm sure other external programs would behave the same way) multiple times through one of these functions, the process hangs when I check on it. This doesn't happen with Apache on Windows.

For example,

$cmd = "sh /usr/local/apache2/myproject/test.sh /path/to/file.htm";
exec($cmd, $output);

test.sh consists of:

#!/bin/sh
/opt/openoffice.org3/program/soffice.bin -invisible -headless -nofirststartwizard "macro:///Standard.Conversion.ConvertHTMLToWord($1)"
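
For reference, in case the hangs come from OpenOffice's user-profile lock (overlapping soffice runs that share one profile block each other), here is a rough variant of test.sh that gives every run its own throwaway profile and redirects output; the /tmp location and the -env:UserInstallation override are assumptions, not something taken from the setup above:

#!/bin/sh
# Sketch: each conversion gets its own user profile so concurrent runs
# don't fight over the same profile lock. $$ is the shell's PID, so the
# directory is unique per invocation.
PROFILE="/tmp/oo-profile-$$"
mkdir -p "$PROFILE"

/opt/openoffice.org3/program/soffice.bin \
    -env:UserInstallation="file://$PROFILE" \
    -invisible -headless -nofirststartwizard \
    "macro:///Standard.Conversion.ConvertHTMLToWord($1)" > /dev/null 2>&1

rm -rf "$PROFILE"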

Basically, the script just calls OpenOffice to convert my files into Word documents. If I invoke the process once in a while it's fine, but if I refresh the page, say, 5 times within 2 seconds, the processes hang and I have to kill them manually.
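
If the overlap itself is the trigger, a minimal sketch of serializing the conversions on the PHP side, so that two refreshes never run soffice at the same time, would look something like this (the lock-file path is only an example; flock() and exec() are standard PHP):

<?php
// Sketch: allow only one conversion at a time so overlapping soffice
// processes can't pile up. /tmp/oo-convert.lock is just an example path.
$lock = fopen('/tmp/oo-convert.lock', 'c');
if ($lock) {
    flock($lock, LOCK_EX);   // blocks until any running conversion finishes
    $cmd = "sh /usr/local/apache2/myproject/test.sh /path/to/file.htm";
    exec($cmd, $output);
    flock($lock, LOCK_UN);
    fclose($lock);
}
?>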

Any ideas why? Or is exec() simply not suitable on Linux?

    6 days later

    Hi, I hate bumping my own threads, but is there really no one here with experience calling external programs through PHP on Linux?

      Thanks a lot, sneakyimp. I've read through it, but although it's nice to be able to retrieve process IDs in PHP, I still can't see how that helps me here.

      Sure, I could write scripts to check the running PIDs and kill them, but then the users of my site wouldn't receive their converted documents when they click the links, because the processes would keep being killed whenever too many people are using it (assuming I have to kill them because they're hanging).

      (In my case, refreshing the page 5 times within 2 seconds piles up hung processes on Fedora.)
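
      For the record, the PID-checking idea would look roughly like this in PHP (pgrep is a standard Linux tool and the threshold is arbitrary); it only detects the pile-up, it doesn't fix it:

      <?php
      // Sketch: count running soffice processes so a request can refuse to
      // start yet another conversion when things are already backed up.
      exec('pgrep -f soffice.bin', $pids);
      if (count($pids) > 3) {   // arbitrary threshold, just an example
          // too many converters already running; bail out or queue the job
      }
      ?>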

      It's just peculiar; the behaviour on Fedora is different. When I refreshed my page 50 times under Apache on Windows, it generated 50 files. With the same code, Linux doesn't seem to care how many times I refresh the page: it still outputs only 1 file (if the process doesn't hang), and no file at all if it does.

      I can't believe I've been looking into this problem for more than a week!

        I'm beginning to think that something inside Linux is limiting the number of processes per daemon or per user, and that's why it hangs on the 5th process each time.

        Does Linux have such controls? Is there any way to lift the limit on my processes, or to set their priorities so they finish their work quickly?
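
        Linux does cap the number of processes per user, and priorities are adjusted with nice/renice; a quick way to inspect both is sketched below (the default nproc limit is normally far higher than 5, so it is probably not the culprit here):

        # Maximum number of processes the current user may run
        ulimit -u

        # Per-user/group overrides ("nproc" entries), if any
        grep nproc /etc/security/limits.conf

        # Priorities: start a command at a lower priority with nice,
        # or adjust an already running PID with renice
        nice -n 10 sh /usr/local/apache2/myproject/test.sh /path/to/file.htm
        renice 10 -p <pid>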
