If I recall correctly, robots.txt is used either to exclude bots from accessing certain files or to tell a bot where to find a sitemap. .htaccess is something entirely different: it tells Apache how to serve up pages. For the most part, bots and users should get the same behavior from an .htaccess file. Users don't give a hoot about robots.txt. Robots care a lot about robots.txt, because it tells them which files to ignore and where they might find a sitemap.
A sitemap can be useful for getting pages into a search engine that aren't linked from your home page.
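As a rough sketch (the paths and the sitemap URL here are made-up examples, not anything specific to your site), a robots.txt covering both uses might look like:

```
# Rules for all crawlers
User-agent: *
# Tell bots to skip these directories
Disallow: /private/
Disallow: /cgi-bin/

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```

Note that Disallow is advisory: well-behaved crawlers honor it, but it's not access control the way an .htaccess rule is.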