I am redeveloping a site which has around 3000 pages and I want to make sure that the site is both easy to manage and has efficient code.

The easiest way from a site management point of view is to have a single include() file which contains all the custom functions needed for the whole site, even though many pages will make use of only a few functions.

Does a custom function get parsed by the server even if the function is not called?

In other words, is there a disadvantage to having a single include() file that contains a lot of code which is only used on a few pages, i.e. would it have a noticeable effect on performance?
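
To make that concrete, the kind of thing I mean is a single shared file (the file and function names here are just placeholders):

<?php
// functions.php - one shared include holding every custom function for the site

function format_price($amount) {
    return '$' . number_format($amount, 2);
}

function page_title($title) {
    return htmlspecialchars($title) . ' | My Site';
}

// ...plus a few hundred more, most of them unused on any given page
?>

with every one of the 3000 pages starting with something like:

<?php
require 'functions.php';
echo page_title('Contact us'); // a typical page only calls one or two of these
?>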

    I make RPG sites, and on every page I include the header and footer. In the header, it calls a file called common.php. This script has all the special functions and things that I need for the site, and no, it does not seem to slow down the performance of the site. Though I am sure if you have a lot of code, then you might see a difference. I hope this helps.

     Napster

      Thanks, that's helpful to know.

      However, I am interested to know how the process works, so here's a question:

      If you have a piece of PHP code which is executed, presumably this will consume some resources on the server and there will be a small amount of time for the code to execute.

      If, however the same piece of code is encapsulated in a function but there is no call to that function, does PHP still examine the code and process it in any way, or does it just ignore the code unless the function is actually called?

        Even if it is not used, the parser will still check over the code and process it like any other code. So the answer to your question is yes.
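
        You can see this for yourself by putting a deliberate syntax error inside a function that is never called; the whole script still dies with a parse error, because the function body is parsed along with everything else:

        <?php

        echo 'this line never runs';

        function never_called() {
            echo 'broken'   // missing semicolon - parse error on every request,
        }                   // even though never_called() is never invoked

        ?>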

          I typically split my includes up into loads of files, for structural / maintenance purposes.

          Yes, PHP reads almost all of them on every single hit, parsing in loads of functions which don't get called.

          It does add some work for the server - but not very much.

          If it becomes a problem anyway, you can investigate one of the various PHP caching tools (Turck MMCache, Zend Accelerator, etc.), which will cut down on the startup time by caching already-compiled pages in memory (along with their includes).

          Oh yes, and use require() not include()
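
          The difference shows up when a file is missing: include() only raises a warning and carries on (and you get a fatal error later, the first time you call one of the missing functions), whereas require() stops straight away. A quick illustration, with a made-up file name:

          <?php

          // include(): a missing file is just a warning, the script keeps going
          include 'no_such_file.php';
          echo "still running after the include\n";

          // require(): a missing file is a fatal error, nothing below here runs
          require 'no_such_file.php';
          echo "never reached\n";

          ?>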

          Mark

            One thing I would stress as sites get bigger is the separation of design elements and logic elements. You are obviously already taking this into account by the fact that you have a separate functions file, but there is generally specific page logic for each page.

            On the site I work on we have a relatively complex structure (not quite suitable for scaling up as far as it has been, but very good for mid-sized, mid-traffic sites). Basically you have your, let's say, index.php file, which sets a few variables and then includes a mediator script which handles including the relevant function, page logic, page design, header, navigation, banner, side ad and footer elements. This makes for a very flexible solution.

            However, I would be inclined to check out the available frameworks before building your own from scratch. Check out the section in PHPKitchen on this subject; at the very least it will give you some ideas.
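
            Very roughly, the shape of it is something like this (file names invented for the example):

            <?php
            // index.php - set a few page-specific variables...
            $page_id    = 'home';
            $page_title = 'Welcome';

            // ...then hand over to the mediator
            require 'mediator.php';
            ?>

            and the mediator pulls in everything else:

            <?php
            // mediator.php - include the pieces each page needs, in order
            require 'functions/' . $page_id . '.php';   // page-specific functions
            require 'logic/' . $page_id . '.php';       // page-specific logic
            require 'design/header.php';
            require 'design/navigation.php';
            require 'design/' . $page_id . '.php';      // the page's own template
            require 'design/footer.php';
            ?>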

            HTH
            Bubble

              When a script is run, PHP first compiles the whole thing into opcodes; but since it's a scripting language, it then interprets those opcodes afterwards when it executes them.

              For example, this works:

              <?php
              
              // call foo before we define it
              foo(); 
              
              // unconditionally define foo()
              function foo() { 
                  echo 'i am foo';
              }
              
              ?>
              

              But this will not work, because of the conditional definition, even if the condition is always true:

              <?php
              
              $bar = true;
              
              // fatal error: foo() is not defined yet, because the definition
              // below is conditional and has not been executed when this runs
              foo();
              
              if ($bar) {
                  function foo() {
                      echo 'i am foo';
                  }
              }
              ?>
              

              You can see that PHP compiles the entire script before trying to interpret it. In the first case, PHP knew the definition of foo() before it interpreted the script, so when it came to interpret the call to foo() it had no problem, and it worked. But in the second case, PHP had not yet evaluated/interpreted any conditionals, so the call to foo() resulted in a fatal error, because it interpreted the call to foo() before it got to the function definition.

              I haven't tested it, but I would also imagine that with conditional includes and evals, the included code is not compiled into opcodes until the script has already reached the interpretation stage and realised it must do so. After all, it might not even be possible to know the name of the file to include until interpretation time.
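
              For example, in something like this PHP cannot possibly know which file to compile until the script is actually running (the file names are made up):

              <?php

              // the language isn't known until runtime, so the include can only
              // be resolved while the script is executing
              $lang = isset($_GET['lang']) ? basename($_GET['lang']) : 'en';

              if ($lang != 'en') {
                  include 'lang_' . $lang . '.php';   // e.g. lang_fr.php, lang_de.php
              }

              ?>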

              Although I haven't looked at how the different PHP accelerators work, I would imagine they can only cache part of the script (maybe just the part that PHP initially compiles into opcodes before interpretation). If that's the case, then there could be significant differences in performance when dealing with conditional includes and the like. Having them conditional would probably be a good idea if the code to be included is rarely used, but if it is commonly used, a conditional include might not be cacheable by the accelerator. Again, I'm just guessing here.

              I just wanted to give you a tiny bit of insight into how PHP works. I hope you find it helpful in some way.

                Thanks all, for your help and suggestions
