This is a request for insight as to potential performance/scaling problems that may arise by building content in the following manner:
FILE = Page.php
<?php
class Page {
    // Echoes the entire page: the static shell plus whatever dynamic pieces are passed in $param.
    // Declared static because it is called as Page::draw() below.
    public static function draw($param) {
        echo '
<html>
<head>
<title>' . (isset($param['title']) ? $param['title'] : 'Some default title') . '</title>
</head>
<body>
<!-- Body consists of all the static html seen on every page -->
' . (isset($param['content']) ? $param['content'] : ' ') . '
</body>
</html>
';
    }
}
?>
FILE = index.php
<?php
require_once 'Page.php';
/* Functions & processing to build the page content go here */
$param = array();
$param['title']    = 'Welcome to blah blah blah';
$param['content']  = get_some_info();        // Returns a formatted HTML string
$param['content'] .= get_some_other_info();  // Returns a formatted HTML string
Page::draw($param);
?>
Those two examples are extremely simplified versions of what is in production, but they represent the core of how our pages are created.
Does anyone see any potential problems down the road as far as performance or scaling go? Page creation doesn't get much easier than this, and it gives us great control over many things. However, I am worried that having every page build and echo the static markup along with the dynamic content will create memory issues as concurrency rises and more users hit the site.
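For comparison, here is a minimal sketch of a template-include variant (the page-template.php filename is just a placeholder, not a file we actually have). The static shell lives in a plain PHP/HTML file that gets included, so PHP streams that markup directly instead of it being concatenated into one large string first:

FILE = page-template.php
<html>
<head>
<title><?php echo isset($param['title']) ? $param['title'] : 'Some default title'; ?></title>
</head>
<body>
<!-- Body consists of all the static html seen on every page -->
<?php echo isset($param['content']) ? $param['content'] : ''; ?>
</body>
</html>

FILE = Page.php (template-include variant)
<?php
class Page {
    // Pulls the static markup in from the template file; $param is visible
    // inside the include because include runs in this method's scope.
    public static function draw($param) {
        include 'page-template.php';
    }
}
?>

In that version the static markup never passes through a PHP string, though the dynamic content is still built up as strings first, so I don't know how much it would really buy us.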
I have used this exact method for some heavily used internal (intranet) websites without problems (though I understand that may not count for much at a different scale).
Our initial expectations are very low. If we can generate 2,000 unique visitors a month we will be happy, but in a year's time we will be looking for exponential growth. A little planning now could save us a lot of pain later.