This would mean the PHP script would have to gather all the POST data (which contains the file uploads) from the submitted form before it could even begin to open a connection to the Perl script. If the form contains a few megs of data, it will take a while for the PHP script to gather all that data into its $_POST array before it can then call the Perl script.
Whilst this solution would certainly work, timing is key for this Perl script (again, something I should perhaps have mentioned before 🙂 ), because it needs to start reading STDIN from the HTTP request the moment the form is submitted.
Basically, I'm using the "Mega Upload" Perl scripts provided at http://www.raditha.com/php/progress.php - with a few minor modifications to fit my own system.
This fancy little script reads the raw POST data via STDIN and, as it does so, creates progress files that an external script can read to generate a progress bar. If PHP could start reading from STDIN straight away, before packing everything into the $_POST/$_FILES arrays, then I could have written a PHP equivalent, but as far as I can find this just isn't supported (not via "php://stdin", "php://input" or "$HTTP_RAW_POST_DATA").
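For anyone curious, the core idea looks something like this - just a rough sketch of the STDIN-reading approach, not the actual Mega Upload code (the MY_TMP_DIR variable and the progress file name are my own inventions for illustration):

#!/usr/bin/perl
use strict;
use warnings;

# Rough sketch only: read the raw POST body from STDIN as it arrives
# and record progress in a temp file that another script can poll.
my $tmp_dir  = $ENV{'MY_TMP_DIR'}     || '/tmp';
my $length   = $ENV{'CONTENT_LENGTH'} || 0;    # total bytes the browser will send
my $progress = "$tmp_dir/progress_$$";         # one progress file per request

binmode STDIN;
my ( $read, $buffer ) = ( 0, '' );

while ( $read < $length && ( my $got = read( STDIN, $buffer, 8192 ) ) ) {
    $read += $got;
    # ... spool $buffer off to the uploaded file here ...

    open( my $fh, '>', $progress ) or last;
    print {$fh} "$read/$length\n";             # e.g. "524288/2097152"
    close $fh;
}

The external progress-bar script then just reads that one-line file and turns "bytes read / total bytes" into a percentage.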
So, at the moment I'm using Apache's "SetEnv" directive to create an environment variable that both Perl and PHP can read. I've placed it at the bottom of "httpd.conf", which means the variable is set for every site ...
...
SetEnv MY_TMP_DIR /tmp
...
... but I suppose you could just as easily place it inside a "<VirtualHost ...>" block to create a variable that only exists for that host, or in an ".htaccess" file.
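For instance, something along these lines (just a sketch - the host name and directory are made up):

<VirtualHost *:80>
    ServerName uploads.example.com
    ...
    SetEnv MY_TMP_DIR /tmp/uploads
</VirtualHost>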
This then makes $ENV{'MY_TMP_DIR'} available to Perl, and $_SERVER['MY_TMP_DIR'] available to PHP - perfect!
In the end this is an Apache solution rather than a PHP one, and although the other solutions you've mentioned would work well in other circumstances, this is the only way I can currently think of to meet my (very specific) requirements!
Thanks again 🙂