Wget is good for grabbing static copies of sites off the internet. It will give you a static copy of your site, so none of the dynamic functionality will work, but the navigation and content will still be there.
wget -m -np -p -E -k -w 2 <url>
where -m   = turn on options suitable for mirroring
      -np  = never ascend to the parent directory when retrieving recursively
      -p   = download all the files necessary to properly display a given HTML page
      -E   = append an .html suffix to all files of type text/html
      -k   = after the download is complete, convert the links in the documents to make them suitable for local viewing
      -w 2 = wait 2 seconds between retrievals to lighten the server load
The only post-processing you'll need is a script to go through the downloaded files and convert the ?'s that end up in the page URIs (and therefore in the saved filenames) into something Windows can handle (e.g. underscores). I use a command-line Perl script on the webserver to do the trick.
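I won't reproduce the Perl script, but a minimal sketch of the same idea in Python is below. The "mirror" directory name and the underscore replacement are assumptions for illustration, not part of the original setup; note that links inside the rewritten HTML may need the same substitution.

    #!/usr/bin/env python3
    # Sketch only: rename wget output so the filenames are valid on Windows.
    # The "mirror" directory name and the underscore replacement are assumptions;
    # the original answer uses a Perl script on the webserver instead.
    import os

    MIRROR_ROOT = "mirror"  # directory produced by wget -m (assumed name)

    for dirpath, _dirnames, filenames in os.walk(MIRROR_ROOT):
        for name in filenames:
            if "?" in name:
                # e.g. "example.php?var1=x&var2=y.html" -> "example.php_var1=x&var2=y.html"
                new_name = name.replace("?", "_")
                os.rename(os.path.join(dirpath, name),
                          os.path.join(dirpath, new_name))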
ONE MAJOR CAVEAT: you must have your URLs in the "example.php?var1=x&var2=y" format instead of the SEO-friendly "example.php/var1/x/var2/y" format, as the SEO-friendly format will cause lots of sub-directories to be created and all your relative links will be broken. I just have my code look for the Wget user agent and skip the SEO-friendly links.
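That user-agent check isn't shown here, but the idea is just to branch on the User-Agent header when building links. A rough sketch in Python follows; the function name, the WSGI-style environ parameter, and the link formats are illustrative assumptions, since the original does this inside its own server-side code.

    def build_link(environ, var1, var2):
        """Return a plain query-string link for Wget, an SEO-friendly link otherwise.

        Sketch only: the function name, parameters, and link formats are
        illustrative; the original answer does this check in its own server code.
        """
        user_agent = environ.get("HTTP_USER_AGENT", "")  # e.g. "Wget/1.21.3"
        if "Wget" in user_agent:
            # flat query-string form so wget saves one file per page
            return "example.php?var1={0}&var2={1}".format(var1, var2)
        # SEO-friendly form for normal visitors
        return "example.php/var1/{0}/var2/{1}".format(var1, var2)

When wget crawls the site it then only ever sees the query-string form, so the mirror stays flat instead of sprouting sub-directories that break the relative links.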