I've done a little bit of research on how search engines work.
What I've found out is that it's important to have good meta tags. The content of the page(s) must be straight to the point - highlighting the website's features etc. And when registering the website with search engines, it's important to list the key elements of the page.

BUT - different search engines work differently. And if I'm not mistaken, some will have a hard time indexing dynamic pages.

So my question is:
I'm developing my pages in PHP, and some of the information is retrieved from a DB and from text/data files. This information is fetched at runtime. So HOW can I make my pages detectable by web crawlers, search engines etc.? (Note: I HAVE submitted the URL to search engines.)

Cheers!

    This, taken from Google's Fact vs. Fiction page on their site:

    Fiction: Sites are not included in Google's index if they use ASP (or some other non-html file-type.)
    Fact: At Google, we are able to index most types of pages and files with very few exceptions. File types we are able to index include: pdf, asp, jsp, hdml, shtml, xml, cfm, doc, xls, ppt, rtf, wks, lwp, wri.

    While they don't specifically mention PHP, I know my PHP pages are included in their search engine. My advice is to create your pages as you normally would, perhaps visit Google's Webmaster Guidelines page for helpful tips, and not fret. 🙂

    hth
    -Elizabeth

      Sweet! That's a relief 😃

      The more I read, the more I see that META tags are more or less useless now. At least for the bigger search engines anyway.

      And registering websites is tedious work!!!!
      And it takes up a lot of time :-(
      I'm in the starting phase of establishing my own company,
      and part of it is web design.

      So I'm thinking of developing some PHP pages that automatically submit URLs to different search engines.
      Don't have a clue how to go about it, though.
      Has anyone done something similar?

      I know there are some web pages that already have that function - but then I have to put a link to their page etc. etc. Nah, gonna try to make my own list that automatically submits to 20-30 search engines.
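      Something like this minimal sketch is what I have in mind - note the submission URLs here are pure placeholders; each engine's real "add URL" address and parameters would have to be looked up by hand first:

          <?php
          // Sketch only: loop over engines and request each one's "add URL" page.
          // The endpoints below are PLACEHOLDERS, not real submission addresses.
          $engines = array(
              'http://search.example-one.com/addurl?url=%s',
              'http://search.example-two.com/submit?site=%s',
          );

          $mySite = 'http://www.example.com/';

          foreach ($engines as $pattern) {
              $submitUrl = sprintf($pattern, urlencode($mySite));
              // file_get_contents() on a URL needs allow_url_fopen enabled
              $response = @file_get_contents($submitUrl);
              echo $submitUrl . ' => ' . ($response === false ? 'FAILED' : 'OK') . "\n";
          }
          ?>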

      See ya late'a.

        4 months later

        I wouldn't write the submission script. Most engines now make you enter a security code from a graphic to prevent automatic submissions.

        Hand-submit to the majors and DMOZ; the rest will find you.

        Use something like my Botspotter script to see which ones are actually visiting.
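        (For anyone wondering, a bot-spotter boils down to something like this rough sketch - match the User-Agent against known crawler names and log the hit. The bot list and log file here are just examples, not the actual script.)

            <?php
            // Rough sketch of bot spotting: log a line whenever a known
            // crawler's name shows up in the User-Agent header.
            // The bot list and the log file location are only examples.
            $knownBots = array('Googlebot', 'Slurp', 'msnbot', 'ia_archiver');

            $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

            foreach ($knownBots as $bot) {
                if (stripos($agent, $bot) !== false) {
                    $line = date('Y-m-d H:i:s') . "\t" . $bot . "\t"
                          . $_SERVER['REQUEST_URI'] . "\n";
                    file_put_contents('botlog.txt', $line, FILE_APPEND);
                    break;
                }
            }
            ?>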

        Sarah

          It's also a good idea to craft your markup with a proper separation of structure and presentation (XHTML + CSS), and use good semantic markup. Make your pages accessible to visitors with visual impairments - remember, Google is the biggest blind user on the internet.
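          For instance (a hand-rolled fragment, not from any particular site) - headings carry the structure, images get real alt text, and all presentation lives in the stylesheet:

              <?php
              // Illustrative only: a PHP page whose output is plain, semantic XHTML.
              $title = 'Acme Widgets';
              ?>
              <html>
              <head>
                <title><?php echo htmlspecialchars($title); ?></title>
                <!-- presentation lives in the stylesheet, not in the markup -->
                <link rel="stylesheet" type="text/css" href="/css/site.css" />
              </head>
              <body>
                <h1><?php echo htmlspecialchars($title); ?></h1>
                <p>Real headings and paragraphs instead of font tags and layout
                   tables, so a crawler (or a screen reader) sees the structure.</p>
                <img src="/img/widget.png" alt="Photo of an Acme widget" />
              </body>
              </html>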

            Originally posted by goldbug
            Google is the biggest blind user on the internet.

            Not as blind as some other search engines, IMO! For instance, Google can tell if you try hiding keywords on your page by making them the same colour as your background, and probably catches a bunch of other tricks too!

              Another good trick is http://phpbuilder.com/columns/tim19990117.php3

              It's not as relevant now, but it's still a nice technique in certain situations. I have about 70,000 pages on one site using it, and each page appears to be static. Used correctly, it also makes it look like you have directories dedicated to certain topics, which helps your ranking.
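              (The trick boils down to reading the extra path after the script name, so one PHP script serves URLs that look like static files sitting in topic directories. A rough sketch, with a made-up URL scheme:)

                  <?php
                  // Sketch of the PATH_INFO trick: this one script answers URLs like
                  //   /article.php/42/blue-widgets.html
                  // which look static to a crawler. The number picks the DB row;
                  // the trailing words are just there for the keywords.
                  $info  = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '';
                  $parts = explode('/', trim($info, '/'));   // e.g. array('42', 'blue-widgets.html')

                  $articleId = isset($parts[0]) ? (int) $parts[0] : 0;
                  if ($articleId > 0) {
                      // look up and print the article; DB code omitted in this sketch
                      echo 'Showing article ' . $articleId;
                  } else {
                      echo 'No article selected';
                  }
                  ?>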

                I implemented a site based on Tim's tutorial, and while it isn't so relevant any more, I like it because it:

                • lets me block an affiliate's path in robots.txt, stopping me from appearing to have duplicate content (see the sketch below)
                • gives a nicer user experience when I email links.
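
                The blocking itself is just a couple of lines in robots.txt (the directory name here is invented):

                    User-agent: *
                    Disallow: /affiliate/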

                Sarah
