I'm working on a project where we use SVN for our source code control. When changes are committed to the repository, a process on the SVN server copies the changed files to our development server.
Our production server, however, has no such automatic connection -- and for good reason. We auto-deploy committed code changes to the dev server for testing; by the time anything gets committed to the repo, we have typically already tested it on our local workstations. We don't want to auto-deploy repo changes to our live server for obvious reasons -- something could easily break.
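For reference, our dev auto-deploy is the kind of thing a post-commit hook handles. A minimal sketch of such a hook (the checkout path /var/www/dev is an assumption, not our actual setup):

```shell
#!/bin/sh
# Sketch of a Subversion post-commit hook (hooks/post-commit in the
# repository). Subversion invokes it with the repository path and the
# new revision number. /var/www/dev is an assumed working copy that
# the dev web server serves from.
REPOS="$1"
REV="$2"

# Bring the dev checkout up to the revision just committed. Guarded
# so the sketch exits cleanly outside a real hook environment.
if [ -n "$REV" ]; then
    svn update --quiet -r "$REV" /var/www/dev
fi
```

The hook runs as whatever user owns the svnserve/Apache process, so the dev checkout has to be writable by that user.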
The question I have is: once we are convinced that the repo changes need to be rolled out to the production server, what is the best way to do it?
I have suggested running svn export on the production server when we are ready, but my coworker is balking -- he says it makes him too nervous. Furthermore, svn export fetches the entire repo unless I explicitly limit the command to a subdirectory or individual file. svn export can also alter file permissions, preventing Apache from writing certain data files, source files, and/or configuration files. There's also the possibility that someone checks a local configuration file into the repo, which would ultimately end up with dev/test/sandbox credentials finding their way onto the production server.
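To make the svn export idea concrete, each of those objections can be addressed explicitly: export only the subtree we deploy, set permissions ourselves, and exclude the live config file. A sketch (the repository URL, paths, config filename, and the apache user are all assumptions; the commands are echoed rather than executed so the sequence can be reviewed first):

```shell
#!/bin/sh
# Sketch of a scoped production deploy. All paths and the repository
# URL below are placeholders -- swap in your own.
REPO_URL="http://svn.example.com/repo"   # assumed repository URL
STAGE="/tmp/deploy-stage"                # scratch area, not the live tree
DOCROOT="/var/www/html"                  # assumed production docroot

run() { echo "$@"; }   # echoes each command; swap for: run() { "$@"; }

# Export only the subtree we actually deploy, never the whole repo.
run svn export --quiet "$REPO_URL/trunk/site" "$STAGE"
# Set ownership explicitly instead of trusting the export's permissions.
run chown -R apache:apache "$STAGE"
# Sync into place, leaving the live config file alone so workstation
# credentials checked into the repo never reach production.
run rsync -a --exclude config.php "$STAGE/" "$DOCROOT/"
```

Staging into a scratch directory first also means a failed export leaves the live tree untouched.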
Can anyone recommend a sound approach to the workstation / dev server / production server setup when using SVN? At the moment, we are manually tracking changed files and uploading them to the production server via FTP, which is a chore and prone to human error.
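The manual-tracking half of our process, at least, seems automatable: svn can list exactly which files changed between two revisions. A sketch (the URL and revision range are placeholders; the command is echoed so the sketch runs anywhere):

```shell
#!/bin/sh
# List the paths that changed between two revisions, as input to a
# scripted upload instead of a hand-kept list of changed files.
run() { echo "$@"; }   # echoes the command; swap for: run() { "$@"; }

run svn diff --summarize -r 100:105 \
    http://svn.example.com/repo/trunk/site
```

When run for real, svn diff --summarize prints one line per changed path, prefixed with A, M, or D for added, modified, or deleted.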