on what kind of query is being run: are there joins in the query, what are the field types, etc...
one trick, for tables where the data doesn't change much, like a products table, is to convert them to a text file and have the server read the text file instead of hitting the DB... also limit the amount of dynamic content to certain pages, etc... see the sketch below
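here's a rough sketch of that text-file trick in Python... the table, columns, file name, and the sqlite3 stand-in are all just examples, swap in whatever your own setup actually uses:

# dump a rarely-changing table to a text file once, then have pages read
# the file instead of hitting the DB (all names here are hypothetical)
import csv
import sqlite3

CACHE_FILE = "products_cache.txt"

def refresh_cache():
    # re-run this only when the products data actually changes
    conn = sqlite3.connect("shop.db")
    try:
        rows = conn.execute("SELECT id, name, price FROM products").fetchall()
    finally:
        conn.close()  # don't hold the connection open any longer than needed
    with open(CACHE_FILE, "w", newline="") as f:
        csv.writer(f, delimiter="\t").writerows(rows)

def get_products():
    # the web pages call this -- no DB hit at all
    with open(CACHE_FILE, newline="") as f:
        return list(csv.reader(f, delimiter="\t"))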
also consider indexing the table to make any searches quicker...
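for example (the table and column names are made up), if the search page filters on a product name, an index on that column is a one-liner:

# hypothetical: index the column your searches actually filter/join on
import sqlite3

conn = sqlite3.connect("shop.db")
conn.execute("CREATE INDEX IF NOT EXISTS idx_products_name ON products (name)")
conn.commit()
conn.close()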
if you find that you are running multiple queries to get info out of several tables at once, and nested queries might be quicker, then since MySQL does not support nested queries it might be time to switch to a db that supports them (DB2, Oracle, SQL Server, PostgreSQL)...
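to illustrate the difference (again, table/column names are invented), this is roughly what a nested query saves you:

# without subqueries: two round trips, glue the results together in code
import sqlite3

conn = sqlite3.connect("shop.db")

ids = [r[0] for r in conn.execute(
    "SELECT customer_id FROM orders WHERE total > 100")]
placeholders = ",".join("?" * len(ids))  # (assumes ids is non-empty)
big_spenders = conn.execute(
    "SELECT name FROM customers WHERE id IN (%s)" % placeholders, ids).fetchall()

# with subqueries (nested queries): one statement, the DB does the work itself
big_spenders = conn.execute(
    "SELECT name FROM customers "
    "WHERE id IN (SELECT customer_id FROM orders WHERE total > 100)").fetchall()

conn.close()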
Access sucks, though I have seen sites where Access can support 20k+ visitors a day... it all depends on code design, like not opening the db connection until you have to, and closing it immediately afterwards...
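that open-late/close-early pattern looks something like this (connection details and names are just placeholders):

# do all the non-DB work first, only open the connection when you need it,
# and close it the moment you're done so it isn't held across the whole page
import sqlite3

def lookup_product(product_id):
    conn = sqlite3.connect("shop.db")
    try:
        return conn.execute(
            "SELECT name, price FROM products WHERE id = ?",
            (product_id,)).fetchone()
    finally:
        conn.close()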
accessing a db is okay for the web if the time taken is around 100 milliseconds. all dbs are supposed to be able to stay within that limit... can't remember where I read that number but I know it was recent...
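if you want to check your own queries against that rough 100 ms budget, a quick timing wrapper is enough (the query and names are examples):

# time a query and compare against the ~100 ms rule of thumb
import sqlite3
import time

conn = sqlite3.connect("shop.db")
start = time.perf_counter()
conn.execute("SELECT name, price FROM products WHERE name LIKE ?",
             ("widget%",)).fetchall()
elapsed_ms = (time.perf_counter() - start) * 1000
print("query took %.1f ms" % elapsed_ms)  # well over 100 ms = worth tuning
conn.close()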
code design and site design are the two main determinants of how fast a site is and how smooth the flow is...
hth