I don't use mysql_pconnect in my DB connection code, but somehow I hit the "too many connections" error and couldn't get any data from the DB for any page on my site.

I'm not even sure it could be the cron job, but that was one suggestion I found online. The job was only running once a day, making a quick DB query which would have returned 0 results and given up.

Either that or too many people are connecting to my site. Either way, I'm a bit stumped. I've used this same connection routine on a bunch of different little sites I've built, but this is probably the first to be quite as popular.

function db_connect()
{
	$result = mysql_connect('localhost', 'xxxx', 'xxxx');

	if (!$result)
		return false;

	if (!mysql_select_db('xxx'))
		return false;

	return $result;
}


function db_query($query, $db_conn_result)
{
	if (!$db_conn_result)
		if (!db_connect())
			return false;

	return mysql_query($query);
}

A call is made to db_connect() before any query, and then the query is run using db_query().

After I saw this error I altered the db_connect() function like so:

function db_connect()
{
	$result = mysql_connect('localhost', 'xxxx', 'xxxx');

	if (!$result)
	{
		mysql_close($result);
		return false;
	}

	if (!mysql_select_db('xxxx'))
	{
		mysql_close($result);
		return false;
	}

	return $result;
}

and the error went away, but I have no idea whether this change was actually the cause.

    Hey, I'm getting this again, and it's not the cron job. Why would it be happening?

    Can anyone offer any advice?

      First off, you do risk creating several db connections per user.

      $cr = db_connect();
      # some code ...
      
      # you run the risk of misspelling your connection result variable name
      db_query('SELECT ...', $vr);	# = new connection...
      
      # referencing local instead of global variable
      function stuff() {
      	db_query('SELECT ...', $cr);	# $cr is local variable = new connection
      }
      
      $cr = false;	# could happen by "accident"... 
      db_query('SELECT ...', $cr);	# after the above line, every subsequent query will create a new connection
      

      If you instead create a class to handle this, and only use this class to perform queries, there will be no risk of multiple connections per user. One simple example:

      class Db {
      	private static $conn = false;

      	public static function query($qry) {
      		# There is no connection yet, so try to create one. This can only happen once,
      		#  so there is no risk of multiple db connections per user - unless you call
      		#  mysql_close() somewhere, which you shouldn't.
      		if (!self::$conn) {
      			self::$conn = self::connect();
      		}
      		# Still false, which means a connection to the DB can't be established.
      		if (!self::$conn) {
      			# can't connect
      			error_log(mysql_errno() . ': ' . mysql_error());

      			# probably a good idea to throw an exception here, but at the very least...
      			return false;
      		}
      		$r = mysql_query($qry, self::$conn);
      		if (!$r) {
      			# centralize error handling here, e.g.
      			error_log('Query failed: ' . $qry . PHP_EOL . mysql_errno() . ': ' . mysql_error());

      			# possibly throw an exception here as well...
      			return false;
      		}
      		return $r;
      	}

      	private static function connect() {
      		$result = mysql_connect('localhost', 'xxxx', 'xxxx');

      		if (!$result) {
      			# can't call mysql_close(false) - and $result is false here
      			return false;
      		}
      		if (!mysql_select_db('xxxx')) {
      			# this should also be logged; another place that might be good for an exception
      			mysql_close($result);
      			return false;
      		}
      		return $result;
      	}

      	# If you really need to be able to close db connections, i.e. you run a query
      	# early in the script and then don't need any more queries while the rest of
      	# the script takes a long time to run:
      	public static function close() {
      		if (self::$conn) {
      			mysql_close(self::$conn);
      			self::$conn = false;
      		}
      	}
      }
      

      Usage

      $result = Db::query('SELECT ...');
      
      $result2 = Db::query('SELECT ...');
      
      # and if you want to close the connection
      Db::close();
      

      The other thing that can easily become a problem is inefficient queries. If your queries take too long to run, each user holds his/her connection a lot longer, which means you are more likely to run out of connections.
      Use EXPLAIN to see the execution plans for your queries, and make use of the slow query log if need be.
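      For example (the table and column names here are made up, and the SET GLOBAL syntax assumes MySQL 5.1+; on older servers the slow query log is configured in my.cnf instead):

      ```sql
      -- Shows the execution plan: which indexes are used and how many rows are scanned.
      EXPLAIN SELECT * FROM members WHERE email = 'foo@example.com';

      -- Log any query slower than 2 seconds so you can find the offenders.
      SET GLOBAL slow_query_log = 'ON';
      SET GLOBAL long_query_time = 2;
      ```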

      You may also have a badly configured db server.

      As for the cron job: cron scripts can keep running forever, since only PHP scripts run through a web server are subject to max_execution_time. So if you for some reason end up with a
      cron script like

      $conn = db_connect();
      $r = db_query('SELECT ...', $conn);
      
      while (true) {
      
      }
      

      ... then your script will never end, so even if you start a new one just once per day, you will add one more indefinitely running script per day. It's easy to add logging of script start and end times, though, to make sure that they do end and don't take too long to run.
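      A minimal sketch of that kind of logging (the 300-second cap is an assumption, adjust to your job; CLI scripts default to max_execution_time = 0, i.e. unlimited, so the cap is set explicitly):

      ```php
      <?php
      // Hypothetical cron-job wrapper: log start and end times so runaway or
      // slow jobs show up in the log. set_time_limit() gives a hard cap on
      // execution time, since the CLI default is unlimited.
      set_time_limit(300);

      $start = microtime(true);
      error_log('cron job started at ' . date('c'));

      // ... the actual daily DB work would go here ...

      $elapsed = microtime(true) - $start;
      error_log(sprintf('cron job finished, took %.2f seconds', $elapsed));
      ```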

        TheoGB wrote:

        Why would it be happening?

        What is the PHP error message that's getting logged?

        You are logging all PHP error messages, right?
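        If not, a minimal set of php.ini directives to make sure errors are logged rather than lost might look like this (the values and the log path are assumptions - adjust to whatever your host allows):

        ```ini
        ; report everything, show nothing to visitors
        error_reporting = E_ALL
        display_errors = Off
        ; write errors to a file instead
        log_errors = On
        ; hypothetical path - point this wherever your host lets you write
        error_log = /path/to/php_error.log
        ```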

          Interesting.

          Right, I don't know if I'm logging PHP errors; I will check. I questioned my hosting company and they suggested I change to using a persistent connection. Originally I had assumed a persistent connection was precisely the thing that would cause the issue, but I've made the change and it's working okay.

          However, I see your point now about the classes.

          I'll have a look at making the change you're talking about.

          As far as mysql_pconnect() vs mysql_connect() goes, I'd be interested in your thoughts on this too.

          Thanks for the help.

            I don't know about the full implications of using pconnect, since I have never used nor needed it. We are using ordinary connections, and our community has ~20 million page requests per week, which is roughly 33 requests / second.

            I do however know that persistent connections can lead to problems. An ordinary connection is always closed when your PHP script ends. A persistent connection will live as long as your web server, or other intermediary server managing the persistent connection pool, lives. I do not know whether one user's connection may be shared with another user during the processing time for the first user's page request or not, and I suppose that could be implementation dependent.

            However, there are things that are local to each connection, such as LAST_INSERT_ID / mysql_insert_id, user variables (SET @var = ...), temporary tables, etc. Also, transactions and explicit table locking that are not properly dealt with programmatically will go on for as long as the connection lives, which means you have an added risk of blocking all other connections.
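            To illustrate the point about connection-local state (a hypothetical session; the variable and table names are made up), with a persistent connection this state can survive into a later page request that happens to reuse the same pooled connection:

            ```sql
            -- request A, on some pooled connection
            SET @discount = 10;
            CREATE TEMPORARY TABLE tmp_cart (id INT);

            -- request B, if it reuses that same persistent connection:
            SELECT @discount;          -- may still return 10
            SELECT * FROM tmp_cart;    -- the temporary table may still exist
            ```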

            As such, I would not resort to pconnect, but it should be possible to get more information by googling around a bit. pconnect vs connect is one idea that comes to mind. Reading user comments in the entry for mysql_pconnect at www.php.net might shed some more light on the issue, although not every user supplies good or accurate information...

            Also, you should look into using mysqli (mysql improved) rather than mysql. Among other things, it has API support for prepared statements. Also, if you do decide to go with persistent connections, mysqli can provide you with automatic cleanup before a connection is reused; see http://dev.mysql.com/doc/refman/5.1/en/apis-php-mysqli.persistconns.html
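            A quick sketch of what that looks like (not runnable without a database; the credentials, table, and column names are placeholders - prefixing the host with "p:" requests a persistent connection in mysqli):

            ```php
            <?php
            // "p:" asks mysqli for a persistent connection, with automatic
            // cleanup (rollback, unlock tables, etc.) before it is reused.
            $db = new mysqli('p:localhost', 'xxxx', 'xxxx', 'xxxx');
            if ($db->connect_errno) {
                error_log('Connect failed: ' . $db->connect_error);
                exit;
            }

            // Prepared statement: the driver handles quoting/escaping of $userId.
            $userId = 42;
            $stmt = $db->prepare('SELECT name FROM users WHERE id = ?');
            $stmt->bind_param('i', $userId);
            $stmt->execute();
            $stmt->bind_result($name);
            while ($stmt->fetch()) {
                echo $name, PHP_EOL;
            }
            $stmt->close();
            $db->close();
            ```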

              Cheers, Johan.

              I hadn't realised the i stood for improved and just assumed it was for a different database type. D'oh.
