I need some advice. After looking at the many caching options (which are somewhat over my head, programming-wise) I am hoping there is a much simpler way to flatten a dynamic section of my site and then make that section a simple HTML include page. I have a "latest posts" page for my forum that many of my users have bookmarked as their site entry point. Unfortunately this page is very DB-intensive and it would be ideal to flatten it and just refresh it every 5-10 minutes. My site gets about 25,000 page views daily and I am betting this page alone is 3-5K of that.

Any ideas on this would be appreciated. Crons are no problem.

Thanks,
Jim

    Actually I'd suggest that you build that page every time a new post is made. That way it's always up to date for your users. That is, if most of your traffic is read-only. If you have very active postings then this might not fix your problem.

    If you need to, just adjust your page to store all of its output into a variable and then write that variable out to a file named after your current script, and set up the cron job to run when you need it to.

      I would have to say I have "very" active postings. Very often I will make a post to the site and within minutes there are several newer posts than the one I made. I realise there is a tradeoff to doing this every 5 mins or so (less up-to-date info). But this page is really supposed to provide a snapshot of recent postings, not be constantly hit on every minute to see if someone responded to a specific post (as I think some members are doing, because hey...I do it). 🙂

      I need specifics on this also.... That is the trouble I have found with people talking about caching options: it's all general, no specific examples or step 1-2-3 tutorials.

      Thanks,
      Jim

        Unfortunately there are no specifics or tutorials that I've ever found. You just have to wing it.

        But as I stated above, if you re-design the page you currently have to store all of its output (you are coding your entire page in PHP and not embedding HTML, aren't you?) into a variable, then you can just write this variable out to a file. Once the code does this you can set up a cron job to run the script every 10 minutes and you are done.

        <?php
           $html = '';

           // Code to store everything this page outputs into $html

           // Write the finished markup out as a static file
           $fp = fopen('latest_posts.html', 'w');
           fwrite($fp, $html);
           fclose($fp);
        ?>
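        To schedule that script, a crontab entry along these lines would regenerate the file every 10 minutes (the PHP binary path and script path here are assumptions, adjust for your server):

        ```shell
        # Rebuild the static latest-posts file every 10 minutes
        # (both paths below are hypothetical)
        */10 * * * * /usr/bin/php /path/to/build_latest_posts.php > /dev/null 2>&1
        ```

        Redirecting output to /dev/null keeps cron from mailing you the page markup on every run.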
        

          You can use ob_start() and ob_get_contents() to capture your whole page and then save it to a file, as eoghain explained.

          If you set a flag when a new post was made you could recreate the page and reset the flag.

          HalfaBee

          <?php
          // get flag from DB
          if( !$new_post_flag )
          {
             include( 'latest_posts.html' );
             exit();
          }

          ob_start();

          // all of your page here

          $html = ob_get_contents();
          $fp = fopen('latest_posts.html', 'w');
          fwrite($fp, $html);
          fclose($fp);

          $new_post_flag = FALSE;
          // save it to your DB
          ob_end_flush(); // show the page
          ?>
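          As a variation on the flag idea, here is a sketch that swaps the DB flag for a simple file-age check, so the cache also expires on a timer (the cache filename and the 10-minute window are assumptions):

          ```php
          <?php
          // Serve the cached copy if it is under 10 minutes old;
          // otherwise rebuild it with the same ob_start() trick.
          $cache = 'latest_posts.html'; // hypothetical cache filename
          $ttl   = 600;                 // seconds before a rebuild

          if (file_exists($cache) && (time() - filemtime($cache)) < $ttl) {
             readfile($cache);
             exit();
          }

          ob_start();

          // all of your page here

          file_put_contents($cache, ob_get_contents());
          ob_end_flush(); // show the freshly built page
          ?>
          ```

          This avoids the extra DB hit for the flag, at the cost of the page being up to 10 minutes stale.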
          

            That is nice HalfaBee. I haven't worked with [man]ob_start[/man] before but that makes the whole script nice and clean. Plus it will refresh itself when needed and not on a timer. This could however leave the page effectively dynamic if there is enough traffic on the postings.

              Thanks eoghain, ob_start() is very useful in many varied cases, worth having a look at.

              HalfaBee

                I was able to find a simple CGI script that grabs any posted web page. Then I just set up a page result for the posts and it was done. Of course I don't know if this is worse or better than a pure PHP solution but it works. 🙂

                http://hammer.prohosting.com/~runlinux/headlines.shtml

                Then this little snip helped with the include virtual issues...

                $curDir = dirname( getenv( "SCRIPT_FILENAME" ) );
                virtual( "cgi-bin/linuxreview.cgi" );
                chdir( $curDir );
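                For comparison, a pure-PHP version of "grab the rendered page and save it" might look like this (the URL here is hypothetical; file_get_contents() can read from a URL when allow_url_fopen is enabled):

                ```php
                <?php
                // Fetch the live latest-posts page and cache the markup
                // as a static file, same job as the CGI grabber above.
                $source = 'http://example.com/forum/latest_posts.php'; // hypothetical URL
                $html = file_get_contents($source);

                if ($html !== false) {
                    file_put_contents('latest_posts.html', $html);
                }
                ?>
                ```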

                Thanks for all the input though. I may come back to this at some point and attempt to cache more pages across the site.

                Cheers,
                Jim
