If you're using postgres, there are two things you need to know about. One, 7.0.2 is about a zillion times faster at complex or large queries than 6.5.3 was, and two, pg_dump is a way cool utility.
Let's say your database weighs in small enough that you can afford to back it up completely. Then you can use:
pg_dump database_name >file.dmp
to back up a database, and
createdb database2_name
psql -e database2_name <file.dmp
to load it into another database.
So, with a script file, you could run
pg_dump database_name >backup-000825.dmp
with the date set for each day so you have daily backups.
Note that these files tend to compress very well, so bzip2 or gzip can sometimes shrink them by 90% or more.
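For instance, a small cron-able script along these lines (the database name and file names are just placeholders) will date-stamp and compress each dump:

#!/bin/sh
# daily backup sketch -- dump the database, stamp it with today's date, compress it
DATE=`date +%y%m%d`
pg_dump database_name >backup-$DATE.dmp
gzip backup-$DATE.dmp

Run it once a day and you end up with a series of files like backup-000825.dmp.gz.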
Performance-wise, 5,000 queries is nothing unless each of them is fairly large. I've tested update code on tables from 10,000 rows up to a couple million, and it was fast enough to finish almost any set under 500,000 rows in a few hours.
Lastly, one big, complex query is preferable to running the same simple query over and over in a loop. If you aren't familiar with subselects, I highly suggest you read up on them in the postgres online manual; they're much faster than a for/next loop.
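As a rough sketch (the table and column names here are made up), instead of looping in your application and issuing one UPDATE per row, you let the server handle the whole set in a single statement with a subselect:

-- looping from application code: one UPDATE per row, one round trip each
--   UPDATE orders SET status = 'shipped' WHERE order_id = ...;   (repeated N times)

-- one query with a subselect: the server does the whole set at once
UPDATE orders
   SET status = 'shipped'
 WHERE order_id IN (SELECT order_id FROM shipments WHERE shipped_date = '2000-08-25');

The single statement avoids all the per-query parsing and round-trip overhead, which is where the loop version loses most of its time.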