Hello. I'm new to PHP.
I use include files to frame my pages with static elements (like the company slogan and contact information) that are identical on every page of my site.
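For context, here's roughly how each page is put together (the file and directory names below are just examples, not my real ones):

    <?php
    // about.php: one page of the site
    include 'includes/header.php';  // company slogan, navigation, etc.
    ?>
    <p>Content unique to this page goes here.</p>
    <?php
    include 'includes/footer.php';  // contact information, copyright
    ?>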
My concern is that bots like Google will decide this is all duplicate content (which it is, if you crawl the same include output on every page) and therefore refuse to crawl and index my site.
From the research I have done, I believe I can put all of these includes (with the redundant text) into a single directory, and then use a robots.txt file in my website's root directory to disallow bot access to that directory.
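Concretely, assuming the includes live in a directory called /includes/ (again, just an example name), the robots.txt would look something like this:

    # robots.txt, placed in the site root
    User-agent: *
    Disallow: /includes/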
What confuses me is this: since PHP pulls these includes in on the server side, I don't know whether the robots.txt file will have any effect, or whether PHP will just merge the includes into the main page so the bots never realize the content came from the "forbidden" includes directory.
Will the robots.txt file work in this scenario to prevent the include content from appearing in the main page and/or being crawled by bots? Will this fix my duplicate content problem as far as the search engines are concerned, or is there a better way of accomplishing this?
Please advise!
Thanks a lot for any assistance!
Jeff