Help. I have a dynamic website with thousands of internal links. When people use automated means to crawl the site (sometimes requesting dozens of pages per second, on top of already significant regular traffic), my server grinds to a halt.
I have an .htaccess file that bans the usual suspects (Internet Ninja, etc.), but that doesn't seem to be enough. I try to keep an eye on the weblogs live and ban IPs as soon as I see one getting greedy, but that solution is obviously less than ideal.

So I need a way to monitor my site for this kind of activity and shut it down automatically. Unfortunately, because of the number of hits we get from legitimate visitors, I don't want the database burden of logging every page request from every visitor and then, before serving each page, searching that table for the number of recent requests from the same IP. That seems like it would be a big load on my MySQL server, which already runs at 25% CPU during peak times.
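To make concrete the kind of check I'm imagining (without touching MySQL), here's a rough sketch of a per-IP sliding-window counter kept in memory. Python is just for illustration, and the names and threshold numbers (WINDOW_SECONDS, MAX_REQUESTS, is_greedy) are made up, not tuned against my real traffic:

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 10   # how far back to look (illustrative value)
    MAX_REQUESTS = 30     # requests allowed per IP in that window (illustrative value)

    # Per-IP request timestamps held in memory, so nothing is written to the
    # database and each check only touches that one IP's recent history.
    _recent = defaultdict(deque)

    def is_greedy(ip, now=None):
        """Return True if this IP has exceeded its request budget for the window."""
        now = time.time() if now is None else now
        hits = _recent[ip]
        hits.append(now)
        # Drop timestamps that have fallen out of the window.
        while hits and now - hits[0] > WINDOW_SECONDS:
            hits.popleft()
        return len(hits) > MAX_REQUESTS

    # At the top of a page handler, something like:
    #   if is_greedy(request_ip):
    #       serve a 429 / "slow down" page instead of hitting the database

Is something along these lines reasonable, or is there a better-established way to do this (server module, reverse proxy, etc.) that I'm missing?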
Your thoughts on what to do about this problem would be greatly appreciated.
Thanks!