Your best bet is to have your web server measure the time taken for each request.
Then, in your development environment, run the server on a separate box from the client, hit one page at a time on an otherwise idle server, and repeat. That way you get a fairly accurate idea of the time taken to generate a given page, or to run a given operation.
It is important, when measuring this, that you
- Do not run client and server on the same machine
- Don't have anything else happening on the server
- Do ensure that your application is already cached in memory, etc.: repeat the requests and take the lowest value.
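The repeat-and-take-the-lowest procedure can be sketched roughly as below. For illustration this starts a trivial local HTTP server in a thread (so the snippet is self-contained); per the advice above, a real measurement should run against a separate, otherwise idle machine, not the same box as the client.

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in server for illustration only -- in practice, measure
# against your real application on a separate, idle machine.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"hello")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

def time_request(u):
    """Return the wall-clock time for one full request/response."""
    start = time.perf_counter()
    with urllib.request.urlopen(u) as resp:
        resp.read()
    return time.perf_counter() - start

time_request(url)  # warm-up request so caches are populated
timings = [time_request(url) for _ in range(10)]
print(f"best of 10: {min(timings) * 1000:.2f} ms")
server.shutdown()
```

Taking the minimum rather than the mean filters out one-off delays (cold caches, GC pauses, scheduling noise) and gets closest to the intrinsic generation time.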
You can't make much sense of "70% utilisation" on its own: at any given instant a CPU is either 100% or 0% busy, so that figure is just an average over a sampling window. The metric you should care about is page generation time on an otherwise idle system.
Mark