Hey,

First off.. is it "simultain" or "simultaneous" or "simultaneously" in this case?

I'm making the script pafileDB Extreme Edition. I myself don't have any problem with MySQL connections, because I just write the script, test it and release it; I don't have a site where 100 people are looking at it all at the same time. But some people that use my script, or the script mine is based on (paFileDB), do hit those high numbers sometimes. One of them is on a GoDaddy host, and that host is complaining about the number of simultaneous database connections its client has.

Now I'm wondering if there is any way to limit the connections a bit, so that, for example, not every user opens a connection and some users are served from a cache instead. If so, how? It must be PHP 4 and 5 compatible with no server software changes; coding changes in the script aren't a problem.
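To illustrate what I mean by a cache, something like this file-based sketch (PHP 4/5 compatible; the function and parameter names here are just placeholders of my own):

```php
<?php
// Minimal file cache: serve a cached copy if it is fresh enough,
// so not every visitor opens a MySQL connection.
// $fetch_function is the name of a function that runs the real queries.
function cached_fetch($cache_file, $max_age, $fetch_function)
{
    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $max_age) {
        // Cache hit: read the stored result, no database needed.
        return unserialize(implode('', file($cache_file)));
    }
    // Cache miss: run the real queries and store the result for next time.
    $data = call_user_func($fetch_function);
    $fp = fopen($cache_file, 'w');
    fwrite($fp, serialize($data));
    fclose($fp);
    return $data;
}
```

That way only the first visitor inside each $max_age window actually touches MySQL; everyone else reads the file.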

Thanx.

    It seems to me that if that person has such high traffic to his/her website, it is time for that person to upgrade their web hosting plan.

      Could be, but that's not what I asked 🙂

      EDIT:

      Even if you have your own server.. 100 simultaneous database connections is overkill, so there must be something that can be done to prevent that.

        100 simultaneous database connections doesn't seem to be overkill to me, at least not if it is a big site. What if Google could not manage 100 simultaneous connections? OK, that example is extreme, but it is still not wrong as an example.

        But to solve your problem I think you can do two things: see if you can make each query do more, to avoid unnecessary database access, and close the connection as soon as all database queries have been done.

        As laserlight says, it is probably best for that person to look for another hosting arrangement. It is normally cheaper to solve the problem that way than to have someone (you) put a lot of time into developing the script further.

          But making a good script that can handle those simultaneous connections wouldn't hurt ;) The queries seem quite efficient as they are now, so I doubt I can squeeze out more. The only thing I can still do (it was on the schedule anyway) is, instead of fetching all the data, fetch only the data that's needed. The original script fetched everything, which is wrong in my opinion and slows down the database when there is much to fetch.

            To echo a little bit of what others have said and add in a couple of euros of my own, there are a few things you can do, some of which you may be doing now.

            1. Use SELECT statements that request just the columns needed; no SELECT * FROM table.
            2. Use mysql_fetch_assoc - mysql_fetch_array effectively grabs the data twice (once by column name and once by numeric index).
            3. Terminate the connection with mysql_close as soon as the queries are done, especially if the script goes on to do other stuff with the data before it finishes running.

            You are probably doing all of this already, but as you haven't posted any of your code, who knows.
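            All three points in one hedged sketch, using the old mysql_* API (since the thread targets PHP 4/5) and an assumed files table with id, name and size columns:

```php
<?php
// 1. Ask only for the columns you need, never SELECT *.
$result = mysql_query('SELECT id, name, size FROM files');

$files = array();
while ($row = mysql_fetch_assoc($result)) {
    // 2. mysql_fetch_assoc() returns each row once, keyed by column
    //    name; mysql_fetch_array() would duplicate every value under
    //    a numeric index as well.
    $files[] = $row;
}

// 3. Close the connection as soon as all queries are done, even if
//    the script keeps working with $files afterwards.
mysql_close();
```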

            Blu

              First: WTF.. I replied to the post "Blulagoon" made and now my reply is gone.. so, once again:

              1. -> not done but working on it
              2. -> done
              3. -> partly done

              And just to make sure.. what's the best thing to do with MySQL connections?

              Grab the data and close the connection, then open it again once new data needs to be inserted, updated or deleted?

              OR

              open a connection and leave it open till the end of your script?

                Grab the data and close the connection, then open it again once new data needs to be inserted, updated or deleted?

                OR

                open a connection and leave it open till the end of your script?

                Leave it open until the end of the script, or when you know for certain that no more database access is needed for the script.

                  Then another example.. when a user votes, he is actually submitting something to the database.. same with a comment. How would that be done? I can't just bundle all the queries together when there is nothing to put in yet.

                    Could you explain further? I do not understand what you are trying to say.

                      Well.. all the fetch queries can be bundled together into some variables, and after that you can close the MySQL connection, but updates, inserts and deletes will only happen when someone clicks on a certain button or link. Now the question is whether it's wise to close the connection after the fetching and open it again once a user wants to update something, or whether it's better to keep it open for the entire page?

                        The thing is, when your users "click on a certain button or link", they cause a page load. So chances are the flow of control depends on the exact action.

                        So I do not see a reason for you to close and then reopen a connection. In fact, since opening a database connection is relatively slow, your code may even become slower.

                          OK.. then it's better to leave it open 😛
                          Thanx for the help so far.

                          Any other suggestions on improving the query speed?

                            If transactions are available to you, you should consider using them.
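                            For example (a sketch with the old mysql_* functions; the table and column names are made up, and your tables must use a transactional engine such as InnoDB, since MyISAM silently ignores these statements):

```php
<?php
// Group the vote INSERT and the counter UPDATE so that either both
// happen or neither does.
mysql_query('START TRANSACTION');

$ok = mysql_query("INSERT INTO votes (file_id, rating) VALUES (5, 4)");
$ok = $ok && mysql_query("UPDATE files SET vote_count = vote_count + 1 WHERE id = 5");

if ($ok) {
    mysql_query('COMMIT');   // both changes become permanent together
} else {
    mysql_query('ROLLBACK'); // undo any partial work
}
```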

                              ?? never heard of that... what is it?

                                My comment about closing the MySQL connection would only be relevant if a script took information from a database and then performed lots of operations on it. For example, I once did a script where the information was taken from the database, put into arrays, and those arrays were then used to draw graphs on the fly.

                                From what markg85 says, it seems the script is dipping into the database throughout its entire course, in which case it wouldn't be helpful to keep opening and closing the connection.

                                  Thanx for the replies about this subject so far 🙂 and for those transactions.. sometimes it can be so simple 😛

                                  With the opening and closing.. it seems that I will just have to test both to see which works faster.

                                    Assuming that your PHP code uses only one connection per page (it shouldn't need more), you will never see more MySQL connections than your Apache MaxClients setting.

                                    You should never set Apache's MaxClients higher than your server can sustain. Apache's default settings are designed for a non-PHP configuration, and in many cases or configurations, may be more than your server(s) can comfortably manage.

                                    The trick is to limit MaxClients to what you can manage. If you frequently hit this maximum, consider reducing the keepalive time, or disabling keepalive altogether (not recommended, as this increases TCP load and makes things slower for clients, especially if you have lots of images etc.).

                                    Memory is usually where the server fails: with a Unix server running the Apache prefork model (as currently recommended for PHP), a separate copy of PHP is effectively loaded for each and every client (the C code and constant data are still shared, but a lot of dynamic data is created each time).

                                    MySQL does not use a lot of RAM for each client, so it is not typically the cause of problems.

                                    Of course your mileage may vary on any of this, I recommend developing a proper load testing system on your dev/testing network.
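                                    For reference, the relevant prefork knobs sit in httpd.conf; the numbers below are only an illustration, size them to your own RAM:

```
# httpd.conf (prefork MPM) -- example values only; a rough rule is
# MaxClients <= free RAM / memory used per Apache+PHP process.
MaxClients        50
KeepAlive         On
KeepAliveTimeout  2
```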

                                    Mark
