"Screenshot of a site" is a bit of a category error. A screenshot of a page is a P.o.P. these days, as most modern browsers will let you save an entire page, images and all. Making a snapshot of an entire site is another matter, which wget was written to do. As for "optimising" a site as it is saved: what would that entail? It would be more straightforward to save the site first and then run whatever optimisations are intended over the resulting files.
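That "save first, optimise afterwards" workflow can be sketched with standard wget flags; the URL and the optimiser step below are placeholders, not anything from the original question:

```shell
# Build the mirror command; these flags are standard wget options:
#   --mirror           recursion plus timestamping, suited to mirroring
#   --convert-links    rewrite links so the copy is browsable locally
#   --page-requisites  also fetch the images/CSS each page needs
url="https://example.com/"   # placeholder URL
mirror_cmd="wget --mirror --convert-links --page-requisites $url"
echo "$mirror_cmd"

# After the mirror completes, run the intended optimisations over the
# saved tree, e.g. (optipng is just an illustrative optimiser):
#   find example.com/ -name '*.png' -exec optipng {} \;
```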
GNU Wget is a freely available network utility to retrieve files from the World Wide Web using HTTP (Hypertext Transfer Protocol) and FTP (File Transfer Protocol), the two most widely used Internet protocols. It has many useful features to make downloading easier, among them:
o Wget is non-interactive, meaning that it can work in the background while the user is not logged on. This allows you to start a retrieval and disconnect from the system, letting Wget finish the work. By contrast, most Web browsers require the user's constant presence, which can be a great hindrance when transferring a lot of data.
o Wget is capable of descending recursively through the structure of HTML documents and FTP directory trees, making a local copy of the directory hierarchy similar to the one on the remote server. This feature can be used to mirror archives and home pages, or to traverse the web in search of data like a WWW robot. In that spirit, Wget understands the "norobots" convention.
o File name wildcard matching and recursive mirroring of directories are available when retrieving via FTP. Wget can read the time-stamp information given by both HTTP and FTP servers and store it locally. Wget can therefore see whether the remote file has changed since the last retrieval and automatically fetch the new version if it has. This makes Wget suitable for mirroring FTP sites as well as home pages.
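The time-stamping behaviour above is exposed as the -N (--timestamping) flag; a minimal sketch with a placeholder URL:

```shell
# -N asks Wget to compare the server's timestamp with the local copy
# and download only if the remote file is newer.
url="https://example.com/index.html"   # placeholder URL
cmd="wget -N $url"
echo "$cmd"
```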
o Wget works exceedingly well on slow or unstable connections, retrying the download until the document is fully retrieved or a user-specified retry count is exceeded. It will try to resume the download from the point of interruption, using "REST" with FTP servers and "Range" with HTTP servers that support them.
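Resuming an interrupted transfer is the -c (--continue) flag, which relies on the "Range"/"REST" mechanisms described above; the URL is again a placeholder:

```shell
# -c resumes a partially downloaded file instead of starting over,
# provided the server supports HTTP "Range" or FTP "REST".
url="https://example.com/big.iso"   # placeholder URL
cmd="wget -c $url"
echo "$cmd"
```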
o By default, Wget supports proxy servers, which can lighten the network load, speed up retrieval, and provide access from behind firewalls. However, if you are behind a firewall that requires a SOCKS-style gateway, you can get the SOCKS library and build Wget with SOCKS support. Wget also supports passive FTP downloading as an option.
o Built-in features offer mechanisms to tune which links you wish to follow.
o The retrieval is conveniently traced by printing dots, each dot representing a fixed amount of data received (1KB by default). These indicators can be customized to your preferences.
o Most of the features are fully configurable, either through command-line options or via the initialization file .wgetrc. Wget also allows you to define a global startup file (/usr/local/etc/wgetrc by default) for site settings.
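Several of the options above can be set persistently in ~/.wgetrc. A sketch with illustrative values (these are real wgetrc commands, but the values are examples, not Wget's defaults):

```
# ~/.wgetrc -- illustrative values, not defaults
# retry count for unstable connections
tries = 10
# resume partial downloads (like -c)
continue = on
# only fetch files newer than the local copy (like -N)
timestamping = on
# use passive FTP transfers
passive_ftp = on
```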
o Finally, GNU Wget is free software. This means that everyone may use it, redistribute it, and/or modify it under the terms of the GNU General Public License, as published by the Free Software Foundation.