Merve: I have been developing without using HTTPS, so that is not the problem.
swr: you are definitely on the right track as far as caching goes. What I am trying to do is get the browser to not even bother contacting the server, just to take the most recently [locally] cached version of the page and display it back to the user.
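Something along these lines is roughly the effect I am after (a sketch only, assuming plain PHP; the ten-minute lifetime and the "private" scope are just placeholders, not what the app actually uses):
<?php
// Sketch: let the browser reuse its local copy for $lifetime seconds without
// contacting the server at all. "private" keeps shared proxy caches out of it.
$lifetime = 600; // placeholder: ten minutes
header('Cache-Control: private, max-age=' . $lifetime);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $lifetime) . ' GMT');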
I have both seen this done and actually done it myself, though for the life of me I can't remember where or how. The end result is that when you hit the back button, not only do you see the last page, but all of the form values that were submitted re-appear (radio buttons, checkboxes, textboxes, etc. are populated with what was sent to the server).
The application was designed with re-submission of the same data in mind. The data is simply stored in the database and there are no side effects; in other words, hitting Submit three times in quick succession wouldn't make any difference to the data in the database.
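For what it is worth, the property I mean looks roughly like the following (purely a sketch; the table, column, and variable names are made up, and this is not necessarily how the app actually does it). With a unique key on the respondent column, a repeated insert just becomes an update, so three quick submits leave the same row behind as one:
<?php
// Hypothetical example of a submit handler whose effect is the same no matter
// how many times the same form is posted. Assumes MySQL via PDO and a UNIQUE
// key on survey_answers.respondent_id.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$stmt = $pdo->prepare(
    'INSERT INTO survey_answers (respondent_id, answer)
     VALUES (:respondent, :answer)
     ON DUPLICATE KEY UPDATE answer = VALUES(answer)'
);
$stmt->execute(array(
    ':respondent' => $_POST['respondent_id'],
    ':answer'     => $_POST['answer'],
));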
===
I think it works now. Whether this is the fix, or it was just my browser that was malfunctioning, this is the winning combination:
$cachetime = 10800; // 3 hours, in seconds
$gm_expires = gmdate('D, d M Y H:i:s', time() + $cachetime);
// The page is "last modified" the moment it is generated
header("Last-Modified: ".gmdate("D, d M Y H:i:s \G\M\T", time()));
// Browsers may reuse it for 3 hours; proxy caches treat it as stale immediately; pre-check is an IE extension
header("Cache-Control: public, max-age=$cachetime, s-maxage=0, must-revalidate, pre-check=$cachetime");
// Absolute expiry time, for redundancy
header("Expires: $gm_expires GMT");
I scrapped everything and went back to the HTTP/1.1 RFC. I gathered from there that this was a good way to go, and it works perfectly in my web browser at home (although I suspect it still isn't working on my work machine).
It seems to make sense... The page is marked as last modified at the instant it is generated. Cache-Control is public, with client caches keeping the copy for 3 hours (max-age) while proxy caches expire it immediately (s-maxage=0). For redundancy, the Expires header gives the exact time at which the browser should discard its copy.
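One thing the snippet above does not show is what happens once the copy goes stale and must-revalidate kicks in: the browser sends the Last-Modified value back as If-Modified-Since, and the server can answer 304 so the cached copy keeps being used. A rough sketch of how that could be answered (my guess at matching the same 3-hour window; not tested against IE6 or NetCaptor):
<?php
// Sketch: answer the browser's conditional request with 304 Not Modified while
// the previously generated copy is still considered fresh (same 3-hour window).
$cachetime = 10800;
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])) {
    $since = strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']);
    if ($since !== false && (time() - $since) < $cachetime) {
        header('HTTP/1.1 304 Not Modified');
        exit; // the browser re-displays the copy it already has
    }
}
// ...otherwise fall through and regenerate the page with the headers above.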
Hrmph. Any ideas what might be wrong with my browser(s)? Internet Explorer 6.0 SP-1 and/or NetCaptor
Scott