OK, what am I doing and why?
I have a page that has multiple listings. Results are usually presented in a list-type format, paginated by number of results, etc. Each line is a link that, when clicked, displays the detailed result.
The results are also divided by category, etc.
Up until now, neither my own search engine nor outside engines could spider these results. So what I've done is put up a plain vanilla page where I output all the results on a single page, similar to the page that has only links to the detailed results. What I end up with is one long page of links to the detailed results.
The page is not intended for users and is not linked from anywhere on my site; it just sits by itself. I point my spider at it, and since it's in a directory that other spiders already visit, I assume it will be picked up by them as well.
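To make the setup concrete, here is a minimal sketch of how such a "plain vanilla" links page could be generated. The `buildLinkPage` function name, the `detail.php?id=` URL pattern, and the shape of the `results` array are all assumptions for illustration, not the site's actual code:

```javascript
// Sketch: render one plain <a> link per detailed result, so spiders
// can follow each one. URL pattern and data shape are hypothetical.
function buildLinkPage(results) {
  var links = results.map(function (r) {
    // Each line is just a link to the detailed-result page.
    return '<a href="detail.php?id=' + r.id + '">' + r.title + '</a><br>';
  });
  return '<html><body>\n' + links.join('\n') + '\n</body></html>';
}
```

Looping over all results in one flat list keeps the page trivially crawlable: no JavaScript-driven pagination or category filters stand between the spider and the detail links.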
Now, on my side, after I spider the page of links I remove it from my spidered results, so it won't show up in my site's search results. I can't do the same on other engines, though. And although the probability that this page will ever surface in search results on Google or Yahoo! is extremely low, I'd like a redirect that forwards any visitor to the page actually designed for mass consumption.
I like the JavaScript version, which I put after the loop. I assume this gives the page time to output all the links, and once that's done, the redirect fires. Visually it happens pretty fast.
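The redirect-after-the-loop idea could look something like the sketch below. The `HUMAN_PAGE` URL and the `redirectWhenDone` name are assumptions; the point is that the script tag is emitted only after the last link, and an optional `setTimeout` delay gives the page a moment before navigating away:

```javascript
// Assumed user-facing page to forward stray visitors to.
var HUMAN_PAGE = '/search/results.html';

// Emitted after the loop that prints all the links, so by the time
// this runs, every link is already in the document.
function redirectWhenDone(delayMs) {
  if (typeof window !== 'undefined') {
    // setTimeout defers the jump; replace() avoids leaving the
    // spider-only page in the visitor's back-button history.
    window.setTimeout(function () {
      window.location.replace(HUMAN_PAGE);
    }, delayMs);
  }
  return HUMAN_PAGE; // returned so the target can be inspected
}

redirectWhenDone(100); // e.g. a 100 ms grace period after the last link
```

Note that this only redirects clients that execute JavaScript; a crawler that merely fetches the HTML will still see the full list of links.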
The question I have: would the spiders have time to spider all the links before they get redirected?