This depends on so many things:
The size of the page being generated.
Whether or not that page is being built from database data, and if so, how much, since DB queries add overhead.
How much of the content is truly dynamic. If you have large chunks of content that are identical on each page, it's best to keep static stuff static (there's a rough output-caching sketch after this list).
How fat the network pipe is. Just because I can crank out a page in 0.060 seconds on my development box, where I'm connected over 100Mb Ethernet with negligible traffic, doesn't mean the page will feel anywhere near that fast when someone hits the live server over a 56k dialup...
How busy the server is.
How busy the database server is, if you're delivering database-dependent content. Also, whether or not that database server is running on the same physical hardware as the web server. (It's often, but not always, beneficial to run the DB server on separate hardware...)
How many include files are used. Each include() adds a disk hit, and disk access times, I've found, add a large degree of unpredictability to dynamic page generation times.
Whether or not you're using a caching mechanism for your PHP scripts. Basically, caching tools help the server keep your scripts' compiled code and functions in memory, and help to minimize disk hits when include files are reused (see the php.ini sketch after this list)...
The amount of memory granted to the server process. Same for the database server, if applicable...
Processor speed.
Disk drive and controller speeds.
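On the "keep static stuff static" point: one cheap trick is to cache the rendered HTML to a file and serve that until it goes stale. This is only a rough sketch; the /tmp/cache path and the 10-minute lifetime are made-up values, and file_put_contents() assumes PHP 5 (on PHP 4 you'd use fopen()/fwrite() instead).

    <?php
    // Rough output-caching sketch -- cache directory and lifetime are
    // arbitrary examples, adjust them for your own setup.
    $cacheFile = '/tmp/cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
    $cacheLife = 600; // seconds

    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $cacheLife) {
        // Serve the pre-rendered page: no DB queries, no template work.
        readfile($cacheFile);
        exit;
    }

    ob_start();

    // ... build the page as usual (DB queries, includes, etc.) ...
    echo "<html><body>Generated at " . date('r') . "</body></html>";

    $html = ob_get_contents();
    ob_end_flush();                       // send the page to the browser
    file_put_contents($cacheFile, $html); // and save it for next time
    ?>

The win is that a cached hit skips the database and all the page-building work, so the only cost left is a single disk read.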
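And on the script-caching point: the exact tool depends on your PHP version (APC and similar accelerators on older setups, the OPcache extension bundled with PHP 5.5+), but turning one on is only a few lines of php.ini. These directive values are examples, not tuned recommendations:

    ; Opcode cache settings (OPcache, PHP 5.5+) -- example values only
    opcache.enable=1
    opcache.memory_consumption=64       ; MB of shared memory for compiled scripts
    opcache.max_accelerated_files=4000  ; how many script files can be cached
    opcache.validate_timestamps=1       ; still check the disk for changed files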
Since there are so many variables, it's nearly impossible to provide a number. But I do recommend:
Testing in a controlled environment, where there's a minimal (or at least consistent) number of system processes running.
Comparing the load times of the same page in both static and dynamic formats.
Using a timer to evaluate how long each portion of your script takes to execute (there's a sketch of this below). From there, you can tweak portions of your application if you think one component isn't performing optimally.
Have fun...
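For the timer suggestion, here's the sort of thing I mean. It's just a sketch with made-up section names, and microtime(true) needs PHP 5 (on PHP 4 you'd have to add the two halves of microtime() together yourself):

    <?php
    // Crude section timer -- a sketch, not a full profiler.
    $times = array();
    $start = microtime(true);

    // ... connect to the database and run your queries ...
    $times['db'] = microtime(true) - $start;
    $mark = microtime(true);

    // ... build the HTML from the query results ...
    $times['render'] = microtime(true) - $mark;

    $times['total'] = microtime(true) - $start;

    // Dump the numbers into an HTML comment so they don't disturb the page.
    printf("<!-- db: %.4fs, render: %.4fs, total: %.4fs -->",
           $times['db'], $times['render'], $times['total']);
    ?>

Run it against both the static and dynamic versions of the page and you'll see pretty quickly where the time is actually going.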