I want to issue a token to users on my website, linked to a session ID; this will be saved in a database and used later.
What I want to avoid is issuing tokens to robots that crawl my website. How can I do this?
I know I could block robots via a robots.txt file, but I don't want to block them from the site entirely, just avoid issuing them a token.
I thought that if there were known IP ranges for Google, Yahoo, Bing, etc. I could exclude those, but I'm sure there are hundreds of other robots out there.
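For context, the kind of check I had in mind is something like the sketch below: inspect the request's User-Agent header for common crawler signatures before creating a token. This is only a rough idea, and the list of substrings is my own guess, not an exhaustive or authoritative one:

```python
# Minimal sketch: skip token issuance when the User-Agent looks like a crawler.
# The substrings below are assumptions covering well-known bots, not a complete list.
BOT_SIGNATURES = ("bot", "crawl", "spider", "slurp")

def is_probable_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known crawler signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def maybe_issue_token(user_agent: str):
    """Issue a session token only for requests that do not look like bots.

    issue_token() stands in for whatever actually creates and stores the token.
    """
    if is_probable_bot(user_agent):
        return None  # no token for crawlers
    return issue_token()

def issue_token() -> str:
    # Placeholder for the real token creation + database save.
    import secrets
    return secrets.token_urlsafe(32)
```

But I realise a User-Agent check only catches bots that identify themselves honestly, which is why I'm asking whether there is a better approach.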
Any ideas?