I can't even begin to tell you how to pull it off, but I will give you the performance trade-offs for each:
1) This is acceptable with a few users searching a lot of data, or a lot of people searching a little data. When you have a lot of people searching a lot of data, the server will crawl.
2) It would probably be better just to index it with a normal index, because you would have too much redundant info in the database, and you would have more tables to manage and update.
3) Maybe create a database that just includes keywords for each of the pages (see the sketch below). This one sucks because you have to think of keywords for each page, and if the user searches for a keyword that you didn't think of but that does match your page, they won't see that page in the results. You could also do full-text file searching... but that has a huge overhead.
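Just to make the keyword idea concrete, a minimal sketch might look something like this (the page_keywords table and column names are made up for illustration, not your actual schema):

-- Hypothetical keyword table: one row per (page, keyword) pair.
CREATE TABLE page_keywords (
    page_id INT NOT NULL,
    keyword VARCHAR(64) NOT NULL,
    PRIMARY KEY (page_id, keyword)
);

-- Find pages tagged with whatever the user typed.
SELECT DISTINCT page_id
FROM page_keywords
WHERE keyword = 'widgets';

You can see the weakness right away: the page only turns up if somebody thought to insert that exact keyword row ahead of time.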
The best way would be to create your database containing all the page text of each page, and then create a fulltext index on the page title, contents, and author (if you want to include that field).
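If you're on MySQL, a rough sketch of that setup could look like this (table and column names are just placeholders; adjust to your own schema):

-- Older MySQL versions need a MyISAM table for FULLTEXT indexes.
CREATE TABLE pages (
    id INT AUTO_INCREMENT PRIMARY KEY,
    title VARCHAR(255),
    author VARCHAR(100),
    contents TEXT,
    FULLTEXT (title, contents, author)
) ENGINE=MyISAM;

-- Search all three fields at once with the fulltext index.
SELECT id, title
FROM pages
WHERE MATCH (title, contents, author) AGAINST ('whatever the user typed');

That way the database does the word matching for you instead of you scanning files or maintaining keyword lists by hand.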
I know this isn't a great help, but it is a little help. (at least I hope!)
-mark