I've been browsing the web for Ajax tutorials and the like, and I've been left with a lingering question:
Will top search engines find my site if it's 100% Ajax?

I work for a web company, and we have to ensure that everything we build encourages search engines to crawl our websites. Judging by the way Ajax websites work, I find it hard to believe that a content-rich website would let search engines crawl its pages if the content is delivered through the Ajax technique.

Personally, I'd only consider using Ajax for the back end of my future projects. As the lead developer for our in-house CMS, I have doubts about pushing Ajax into our front end. So if anyone has any input, that'd be great 🙂

    Yes and no. Directly, no. But if you do some careful planning (a site map) and build the engine so that it can handle URLs accordingly, you'll be fine.

    In my opinion, Ajax isn't typically used to drive an entire site; it should be used for some of the smaller tasks.

      Spiders don't like ANY kind of site that has 100% dynamic URLs. You'll notice at the bottom of these forums there is an archive link. That is a static HTML archive specifically designed to make the forum more SE-friendly and increase rankings. I can tell you that it works: the rankings of all our forums increased when we upgraded to the version that contained it.

        Well, after some further investigation, and talking in a couple of Ajax IRC channels, it's confirmed that a pure Ajax front end would be a bad idea. I'm still going to continue developing a complete Ajax back end (I believe it's worth the blood, sweat, and tears, plus the lack of sleep) and keep our front end more of a classical 'web site'.

        Personally, I feel Ajax would be an ideal solution for a complete website, not just web apps. Just imagine how much more attractive all of your personal websites could be (I can imagine everything I'd never expected from a 'normal' dynamic website). Of course, an engine must be there, built with the expectation that it will be interacting with Ajax. Ajax is, after all, just a means of implementing a user-friendly, modern UI for clients and users. So I don't feel that Ajax should only be used for small little gizmos I'd want to 'spice up' in a CMS. If I wanted to settle for that, I wouldn't even consider Ajax; I could stick to a few little JavaScript functions to accomplish it.

        I also understand how creating SE-friendly URIs to pass variables is a must to ensure ranking and crawlability. SEO is a large part of our business, and it's why clients come to us. Thanks for the tip on the static HTML archive. I've considered implementing something like it as an optional module in the past, but we still haven't run into a situation where it would be necessary. I'm sure we could get creative with a similar feature, though.
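        As a rough sketch of the SE-friendly URI idea (the script name, parameters, and mapping scheme here are all made up for illustration; a real site would usually do this with server-side rewrite rules), a dynamic query-string URL can be mapped onto a plain path that spiders index more readily:

        ```javascript
        // Hypothetical helper: turn a dynamic URL like
        // "article.php?id=42&title=ajax-and-seo" into a crawlable path
        // such as "/article/42/ajax-and-seo".
        function seoFriendlyUrl(dynamicUrl) {
          var parts = dynamicUrl.split('?');
          var script = parts[0].replace(/\.php$/, ''); // "article.php" -> "article"
          if (parts.length < 2) {
            return '/' + script;                       // no query string to rewrite
          }
          var pairs = parts[1].split('&');
          var segments = [];
          for (var i = 0; i < pairs.length; i++) {
            segments.push(pairs[i].split('=')[1]);     // keep the parameter values only
          }
          return '/' + script + '/' + segments.join('/');
        }
        ```

        The same mapping has to exist on the server side as well, so that the rewritten paths actually resolve to real pages.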

        Thanks for the input. Happy coding.

          I think that where AJAX comes into its own is not so much the ability to "spice up" the look of web pages, but to make them more user-aware. A modern word processor such as Word or OO works with the user in the background: auto-saving, spell-checking, and a fully event-driven interface.

          The sites I have seen that use AJAX have not used it for everything. One particular site, which I didn't even realize used it, auto-saves what you type in a textarea. I was surprised to find that the next time I returned to the site, the text I had entered (but not submitted) had been pre-filled. Google's use of Ajax in its GMail UI and Google Maps is also impressive.
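          A minimal sketch of how such an auto-save could be wired up (the element id and polling interval are assumptions, and the send function is injected, since in a browser it would POST the draft via XMLHttpRequest):

          ```javascript
          // Track a field's value and only call the injected send
          // function when the text has actually changed, so we don't
          // hammer the server with identical drafts.
          function makeAutoSaver(send) {
            var lastSaved = null;
            return function save(currentText) {
              if (currentText === lastSaved) {
                return false;        // nothing new, skip the request
              }
              lastSaved = currentText;
              send(currentText);     // e.g. xhr.send('draft=' + encodeURIComponent(currentText))
              return true;
            };
          }

          // Hypothetical browser wiring, polled every few seconds:
          //   var saver = makeAutoSaver(postDraftViaXhr);
          //   setInterval(function () {
          //     saver(document.getElementById('comment').value);
          //   }, 5000);
          ```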

          The important thing to remember, in my opinion, is that developing an AJAX-only front end will always be a bad idea, especially if your site serves the general public, some of whom have special requirements and need software such as screen readers to view sites. If you are going to have an AJAX front end, take the time to provide a standard static front end too, for those who do not wish to, or cannot, use JavaScript.

          Another thing I have heard is that users are so used to seeing pages go blank while they load that when they click a link and only part of the page content changes, they simply do not notice. 🙂

            JPnyc wrote:

            Spiders don't like ANY kind of sites that have 100% dynamic urls.

            That is absolutely untrue, and one of the most outdated modern myths of all. There are reasons the archive pages index well, but their not being "dynamic" is not one of them.

            On topic: any front-end AJAX (or, forget AJAX, any front-end JavaScript) should be degradable, for accessibility. That it is then also accessible to spiders is a side effect. I don't think designing for spiders is the best way to go about it. Design it well for users (accessibility, etc.) and it will be (or should be) good for spiders too.
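            To make that concrete, here is one hedged sketch of degradable click handling. The link keeps its real href, so spiders and script-less browsers follow it unchanged; the handler only cancels normal navigation when the enhanced path actually works. The helper names are invented for illustration, and a real XMLHttpRequest load would be asynchronous rather than this simplified try/catch:

            ```javascript
            // Feature-detect before enhancing anything; env is passed in
            // so the check is testable outside a browser. In a browser:
            //   canEnhance({ xhr: typeof XMLHttpRequest !== 'undefined',
            //                dom: !!document.getElementById })
            function canEnhance(env) {
              return !!(env && env.xhr && env.dom);
            }

            // Click handler body: attempt the Ajax load of the content
            // area; if it fails, return true so the browser falls back
            // to plain navigation via the link's real href.
            function enhancedClick(loadFragment, href) {
              try {
                loadFragment(href);  // fetch and inject the page fragment
                return false;        // cancel the default navigation
              } catch (e) {
                return true;         // degrade: follow the href normally
              }
            }
            ```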

              The best-indexed websites are websites with CONTENT. If you have content, the spiders will love you. For example, create a new article every day with a good page title, like the actual name of the article rather than your business name. Make the article relevant to your webpage.

              Also build a good PR (PageRank): the more sites that link to you, the higher your rank will be.

              Content, content, content is the key. Also make sure that spiders can find ALL of your content, not just the newest.

              GG Search Engine Optimization!

              --FrosT
