We have written an online, web-based resource management game that
currently has over 16,000 players, with 1,000+ online at any given
time. We are having some serious performance issues at this point
(we COULD support more than 1,000 concurrent players, but we had to
limit it because of performance). Just about everything the user does
goes through this logic:
1) Open the database.
2) Perform a SELECT on the database to verify the player identity.
3) Close the database.
4) Depending on the action the player requested:
a) Open the database
b) Perform a SELECT/UPDATE/INSERT depending on the player's action.
c) Close the database.
5) Exit
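As an aside on the flow above: both the identity SELECT and the action
query could run over a single connection opened once per request. A
minimal sketch with PEAR DB (the DSN, table, and column names are
hypothetical, and this obviously needs a live server to run):

```php
<?php
require_once 'DB.php';

// Open ONE connection for the whole request...
$db =& DB::connect('mysql://user:pass@localhost/gamedb');
if (DB::isError($db)) {
    die($db->getMessage());
}

// ...verify identity and perform the action on the SAME handle.
$playerId = $db->getOne(
    'SELECT id FROM players WHERE name = ? AND pass = ?',
    array($name, $passHash)
);
$db->query(
    'UPDATE players SET gold = gold + 10 WHERE id = ?',
    array($playerId)
);

// $db->disconnect() is optional here; PHP tears the connection
// down at script exit anyway.
```

This halves the connect/close work per request without changing the
security checks at all.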
Now, the questions are:
1) Does PHP/PEAR DB cache database opens/closes? Is there a
   performance hit from opening and closing the database multiple
   times per request?
2) Will PHP/PEAR DB close an open database connection when my script exits
   if I forget to?
3) Is there a way to have a persistent database connection for each user?
   What would happen if the player just closed their browser instead of
   logging off? Would the connection stay open, occupying db handles?
4) Is there a way to have PHP open a pool of persistent db connections
   and have all of the scripts (players) share those master connections?
   Would this help performance or hurt it?
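On the persistent-connection side of this question: PEAR DB exposes a
'persistent' option that maps to mysql_pconnect() underneath, so the
handle outlives the script and is reused by the next request served by
the same web-server process. A sketch (hypothetical DSN, needs a live
server):

```php
<?php
require_once 'DB.php';

// 'persistent' => true makes PEAR DB use a persistent connection:
// it is NOT closed at script exit, and the next request handled by
// the same Apache child reuses it. Open handles are then bounded by
// the number of server processes, not by the number of players, so
// a player closing the browser does not strand a handle.
$db =& DB::connect('mysql://user:pass@localhost/gamedb',
                   array('persistent' => true));
if (DB::isError($db)) {
    die($db->getMessage());
}
```

The trade-off is that each web-server process holds a handle even when
idle, so the db's max-connections setting has to cover the full process
count.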
5) We are currently using MySQL, but since we built the game on PEAR DB
   we can easily switch to PostgreSQL. Which one performs better under
   loads like this? What are the benefits/restrictions of each?
6) Any suggestions on a way to optimize this scenario without compromising security?
7) Kind of off topic, but we're using templates for everything (XTemplate
   to be specific, I love it!). Is there any way to have PHP process a
   template? For example: we load the template, parse it, and then we want
   PHP to interpret the result of our template... 🙂
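One way to do what question 7 asks is to feed the parsed template
string to eval(). The leading '?>' drops eval() back into HTML mode, so
the string is treated like a normal PHP page rather than raw PHP code.
(The function name and sample template here are just illustrative; and
since eval() runs arbitrary code, only use this on templates you
control, never on user input.)

```php
<?php
// Run a template's parsed output through PHP itself: after the
// template engine has substituted its own tags, any embedded
// <?php ... ?> blocks in the result get interpreted too.
function processTemplate($parsed)
{
    ob_start();                // capture whatever the template prints
    eval('?>' . $parsed);      // '?>' switches eval() into HTML mode
    return ob_get_clean();
}

// Hypothetical parsed template containing embedded PHP:
$parsed = 'Gold: <?php echo 5 + 5; ?> pieces';
echo processTemplate($parsed);  // prints "Gold: 10 pieces"
```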