We are in the process of building a small script for our webserver that lets us log into an admin area and view basic stats about the site. We wanted to add a spider blocker to the system in PHP, but after looking around for a while we haven't found a method of doing this. What we would like to do:
View the spiders (user agents) that are hitting the site, then offer an option to block or deny that spider access to the site. For example, I know a lot of people use web-reaper programs that ignore the robots.txt file and grab your whole site.
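To give an idea of what we're after, here's a rough sketch of the logging/blocking side (the file names spiders.log and blocked_agents.txt are just placeholders I made up, not anything we have yet):

```php
<?php
// Rough sketch: record each visitor's user agent so the admin page
// can display them, then deny any agent matching a blocklist file
// maintained from the admin area (one pattern per line).

$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : 'unknown';

// Append to a simple log that the admin page can read back and list
file_put_contents('spiders.log', date('c') . "\t" . $agent . "\n", FILE_APPEND);

// Deny agents that match any pattern in the admin-maintained blocklist
if (is_readable('blocked_agents.txt')) {
    $blocked = file('blocked_agents.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    foreach ($blocked as $pattern) {
        if (stripos($agent, $pattern) !== false) {
            header('HTTP/1.1 403 Forbidden');
            exit('Access denied.');
        }
    }
}
?>
```

The idea would be to include this at the top of every page, and have the admin page simply read spiders.log and append lines to blocked_agents.txt.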
I know you can do a rewrite condition and redirect the spider in a .htaccess file.
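For reference, the kind of .htaccess rule I mean (using WebReaper as an example agent) looks something like this, though it has to be edited by hand rather than from an admin page:

```apache
# Deny requests whose user agent contains "WebReaper" (case-insensitive)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} WebReaper [NC]
RewriteRule .* - [F,L]
```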
Any thoughts on how this could be done from an admin web page, blocking known spiders based on user input? Or can you point me to some code snippets or functions that might help?
sTeve