Search engines don't necessarily assume that a "404" error (or any other type of error) is permanent.
An intelligent spider will probably schedule a not-found URL for another attempt later, in case the failure is temporary (e.g. the site has been taken down for maintenance or replaced by a holding page).
Nor will those pages be removed from the index unless the spider gets 404s repeatedly over several attempts; only then will it de-list the page and stop trying to fetch it.
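To make that concrete, here's a minimal sketch of the "repeated 404s before de-listing" idea. The threshold, the reset-on-success rule, and all the names are my own assumptions for illustration, not how any particular engine actually behaves.

```python
# Hypothetical sketch: de-list a URL only after several consecutive 404s.
from dataclasses import dataclass

MAX_CONSECUTIVE_404S = 3  # assumed threshold, purely illustrative


@dataclass
class CrawlRecord:
    url: str
    consecutive_404s: int = 0
    listed: bool = True


def record_fetch(record: CrawlRecord, status: int) -> CrawlRecord:
    """Update a URL's crawl record after one fetch attempt."""
    if status == 404:
        record.consecutive_404s += 1
        if record.consecutive_404s >= MAX_CONSECUTIVE_404S:
            record.listed = False  # de-list only after repeated failures
    else:
        record.consecutive_404s = 0  # a later success resets the counter
    return record


if __name__ == "__main__":
    page = CrawlRecord("http://example.com/old-page")
    for status in (404, 200, 404, 404, 404):
        record_fetch(page, status)
    print(page.listed)  # False: three 404s in a row since the last success
```

The point of the reset-on-success step is exactly the maintenance/holding-page case above: a page that comes back before the threshold is hit never gets dropped.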
Not all spiders are that clever, though. The ones I have the most problems with are the evil / lame ones, which don't usually identify themselves as robots anyway. Google, MSN, Yahoo, etc. don't usually give problems.
Mark