You can also use a .htaccess file to control access to individual directories; the server consults it to decide whether each request should be satisfied or refused.
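For example, a minimal .htaccess that password-protects a directory using basic authentication might look like this (the AuthUserFile path below is just illustrative; point it at wherever your actual htpasswd file lives, ideally outside the web root):

```apache
# Require a valid username/password for everything in this directory
AuthType Basic
AuthName "Restricted Area"
# Example path only - use the real location of your htpasswd file
AuthUserFile /home/example/.htpasswd
Require valid-user
```

You'd create the password file with something like `htpasswd -c /home/example/.htpasswd someuser` (the `-c` flag creates the file, so only use it for the first user).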
frikikip wrote:Some site rippers, however, don't check robots.txt to see what they are allowed to do.
Of course, a site ripper (or any other spider, or anyone else without a valid username and password) wouldn't be able to see the protected content anyway - that's the whole point of requiring a username and password in the first place.