Hi,
This is really a request for thoughts from server administrators or people with solid Unix knowledge. (That said, everyone's opinion is appreciated.)
I have taken a number of steps to set up and secure a working Linux server with the necessary services in place. I have also tested small PHP applications that open pipes to files and read/write data from them.
The problem isn't execution; it's figuring out the best way to take the next step.
I have to:
->Write a multi-user, web-based management tool which will edit server files.
The problem is that with multiple users adding and deleting entries in the qmail/FTP configuration files, I need a system that can lock a file while it is being processed and queue user requests for later processing.
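For the locking half, PHP's flock() may be enough: it takes an advisory exclusive lock that other cooperating PHP processes will honour. Here is a minimal sketch of a locked read-modify-write, assuming a hypothetical config file path (the real qmail/FTP files will obviously differ):

<?php
// Minimal sketch: edit a config file under an exclusive advisory lock.
// $path is a hypothetical placeholder; substitute the real config file.
$path = '/etc/hypothetical/virtusers';

$fp = fopen($path, 'r+');
if ($fp === false) {
    die("cannot open $path\n");
}

// Block until we hold the lock. Every script touching this file must
// also call flock(), because the lock is only advisory.
if (flock($fp, LOCK_EX)) {
    $size = filesize($path);
    $contents = ($size > 0) ? fread($fp, $size) : '';  // current entries

    // ... add or delete entries in $contents here ...

    rewind($fp);
    ftruncate($fp, 0);           // rewrite the file in place
    fwrite($fp, $contents);
    fflush($fp);                 // push the changes out
    flock($fp, LOCK_UN);         // release the lock
}
fclose($fp);
?>

The main caveat is that flock() is advisory: anything that edits the file without calling flock() will ignore it, so every writer (web front end and cron job alike) has to go through the same code path.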
This is my plan:
Write each user's requests to an 'individual_user_queue' file. Have a cron job loop through the queue directory, read each file's entries one by one, and execute them.
Run the cron job every 10 minutes or so in the background to accomplish this.
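Roughly what I have in mind for the cron side, as a sketch only; the queue directory and the apply_request() handler are hypothetical placeholders, and the crontab line in the comment just shows the 10-minute schedule:

<?php
// Rough sketch of the cron-side queue processor. Paths and the
// apply_request() handler are hypothetical placeholders.
// Example crontab entry for the 10-minute schedule:
//   */10 * * * * /usr/bin/php /usr/local/bin/process_queues.php

$queue_dir = '/var/spool/webadmin/queue';   // hypothetical

// Stub: parse one queued request line and apply it to the target
// config file (that edit should itself happen under flock()).
function apply_request($request)
{
    // real add/delete logic goes here
}

$dh = opendir($queue_dir);
if ($dh === false) {
    die("cannot open $queue_dir\n");
}

while (($file = readdir($dh)) !== false) {
    if ($file == '.' || $file == '..') {
        continue;
    }

    $fp = fopen("$queue_dir/$file", 'r+');
    if ($fp === false) {
        continue;
    }

    // Lock the per-user queue file so the web front end (which must
    // also use flock() when appending) cannot write mid-drain.
    if (flock($fp, LOCK_EX)) {
        while (($line = fgets($fp)) !== false) {
            $request = trim($line);
            if ($request != '') {
                apply_request($request);
            }
        }
        // Empty the file while still holding the lock; writers that
        // were blocked will then append to a fresh, zero-length batch.
        ftruncate($fp, 0);
        flock($fp, LOCK_UN);
    }
    fclose($fp);
}
closedir($dh);
?>

Truncating under the lock (instead of deleting the file) avoids the race where a blocked writer appends to an inode that has just been unlinked.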
I am sure others have written many programs like this, and measures must exist to avoid the sticky situations that come with concurrent file edits. I am open to any suggestions or ideas you may have.
Thank you for reading this looong post 🙂
Regards,
-m.