There are 'webstress' tools which can request a set of URLs using many concurrent connections, and most of them can also insert a changing number into each URL so that server-side caching cannot influence the result.
Using those tools you can effectively simulate a large number of users browsing your site. The tool measures how long it took to fetch each requested page from your server, and you can use system tools like 'top' or 'sar' to monitor the system's performance and load during the test.
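To give a rough idea of what such a tool does internally, here is a minimal sketch in Python; the target URL, concurrency and request counts are made-up values for illustration. It fires requests from a thread pool, appends a changing number to every URL to defeat server-side caching, and records how long each fetch took:

    import time
    import random
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    BASE_URL = "http://example.com/page"   # hypothetical test target
    CONCURRENCY = 50                       # simulated simultaneous users
    REQUESTS = 1000                        # total pages to fetch

    def fetch(i):
        # Append a changing number so each URL is unique and cannot be
        # answered from a server-side cache.
        url = f"{BASE_URL}?nocache={random.randint(0, 10**9)}"
        start = time.monotonic()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        return time.monotonic() - start

    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        timings = list(pool.map(fetch, range(REQUESTS)))

    print(f"avg fetch time: {sum(timings) / len(timings):.3f}s")
    print(f"slowest fetch:  {max(timings):.3f}s")

Real webstress tools add ramp-up, keep-alive connections and proper reporting on top of this, but the basic idea is the same: many parallel fetches, unique URLs, and per-request timing.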
One sidenote though: there is a limit to how many requests you can generate using one network interface. Using tools such as 'webstress' (aptly named after its purpose) you'll probably find that the server never hands out more than 250k-300k pages per hour, even if the page is completely blank. That works out to only about 70-85 requests per second, and it is not a server-speed problem; it's simply the limit of what you can do with one test PC.