Here's one way to do it -- a quick and dirty version (I'll assume Linux, though most of this works on win32 too):
1) download and install 'wget' (on Linux, grab an RPM from rpmfind.net; for win32, search Google for a port)
2) something like this:
'wget -r http://www.phpbigot.com' will recursively spider through the site and save a copy of it to your hard drive (a slightly more complete invocation is sketched after this list)
3) set up cron to trigger wget every so often (sample crontab line after the list)
4) get something like htdig and index it. (look for htdig on SourceForge.net; a minimal setup is sketched after the list)
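
For step 2, here's a slightly more complete invocation (the -P path is just an example; --mirror turns on recursion plus timestamping, -np keeps wget from climbing out of the site, and -k rewrites links so the copy browses cleanly off your disk):

    wget --mirror -np -k -P /var/spool/mirror http://www.phpbigot.com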
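
For step 3, a crontab line like this would refresh the mirror nightly at 3am (install it with 'crontab -e'; same example path as above, and output goes to /dev/null so cron doesn't mail you every night):

    # min hour dom mon dow  command
    0 3 * * * wget --mirror -np -k -P /var/spool/mirror http://www.phpbigot.com >/dev/null 2>&1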
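
For step 4, ht://Dig works off a config file and ships with a 'rundig' wrapper that builds and merges the index in one go. A minimal sketch, assuming the stock attribute names from htdig.conf (your paths will differ, and since htdig crawls over HTTP you'd either serve the mirror from your web server or just point start_url at the live site):

    # excerpt from htdig.conf
    database_dir:  /var/lib/htdig/db
    start_url:     http://www.phpbigot.com/
    limit_urls_to: ${start_url}

    # then build/refresh the index:
    rundig -c /etc/htdig/htdig.conf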
...that's a real skeletal version, but I hope you get the idea. It really depends on what scale you want for this 'script' -- as in how much you want to index and how many websites you want to spider.
...wait a second. Maybe I'm misreading your whole message -- do you want something that posts keywords to search engines so you can parse out the results? If so, I know there's a Perl package that does that (WWW::Search, if memory serves).