Hmm, sounds a bit tricky. My first question to them is "so you want the GP (general public) to view the same pages, only with the sensitive stuff left out, or do you want some pages restricted to on-campus viewing only?" I have a feeling from your post that it's the latter, but I'm not exactly sure....
But you say it's gonna be tough? What's so tough about checking against 1300 computers? A server can compare a request's IP against 1300 addresses in a matter of milliseconds, can't it?
Are they all publicly routable addresses? It kinda sounds that way, but "1300 internet connected computers" doesn't exactly tell me. Even if they are, the University's IP allotment should be fairly contiguous. Or are they private class C's that are attached via routers? Or... whatever. First find out about the network's topology, and then you can make what might be a better decision.
Say they had ten class C's in the 192.168 private space. Parse $REMOTE_ADDR with explode() or substr() (strlen() just gives you a length, it won't grab anything) to pull out the first three octets... if it's not 192.168.0.xxx through 192.168.9.xxx, it's alien.
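Something like this rough sketch, with the ten 192.168.0.x through 192.168.9.x subnets standing in for whatever ranges the campus actually uses:

<?php
// Rough sketch: allow only 192.168.0.x through 192.168.9.x
// (hypothetical ranges; substitute the real campus subnets).
$octets = explode('.', $_SERVER['REMOTE_ADDR']);
$third  = (int) $octets[2];
$allowed = ($octets[0] === '192' && $octets[1] === '168'
            && $third >= 0 && $third <= 9);
if (!$allowed) {
    header('HTTP/1.0 403 Forbidden');
    exit('Sorry, this page is campus-only.');
}
?>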
Or, what if the addresses are all routable? Likely they have nothing bigger than a class B, so anything that isn't from XXX.YYY.(nnn.nnn) is not wanted, right?
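For that case, ip2long() and a netmask are tidier than string chopping. A sketch, with 130.20.0.0/16 as a made-up stand-in for the university's real class B:

<?php
// Sketch: is the client inside a /16? The 130.20.0.0 network is
// an invented placeholder for the real allotment.
$ip      = ip2long($_SERVER['REMOTE_ADDR']);
$network = ip2long('130.20.0.0');
$mask    = ip2long('255.255.0.0');
if (($ip & $mask) != $network) {
    header('HTTP/1.0 403 Forbidden');
    exit('Sorry, this page is campus-only.');
}
?>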
What if the buildings each have a publicly routable gateway --- the rest of the boxen are hiding behind NAT? Well, then you just check against a list of one IP per building. Surely there aren't thirteen hundred buildings.
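That check is about as simple as it gets; the addresses here are invented placeholders:

<?php
// Sketch: one public gateway IP per building (made-up addresses).
$gateways = array(
    '203.0.113.10',  // library
    '203.0.113.11',  // science hall
    '203.0.113.12',  // dorms
);
if (!in_array($_SERVER['REMOTE_ADDR'], $gateways)) {
    header('HTTP/1.0 403 Forbidden');
    exit('Sorry, this page is campus-only.');
}
?>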
Hmm, maybe establish a session variable once the remote IP passes the check, and then don't bother re-testing on every page as long as you check for the session var?
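Stick something like this in an include that every restricted page pulls in. The ip_is_on_campus() helper is hypothetical; substitute whichever of the tests above fits the topology:

<?php
session_start();

// Hypothetical helper: swap in the subnet test, the /16 mask,
// or the gateway list, whichever matches the real network.
function ip_is_on_campus($ip) {
    return (ip2long($ip) & ip2long('255.255.0.0')) == ip2long('130.20.0.0');
}

// Do the test once per visitor, then trust the session variable.
if (!isset($_SESSION['on_campus'])) {
    $_SESSION['on_campus'] = ip_is_on_campus($_SERVER['REMOTE_ADDR']);
}
if (empty($_SESSION['on_campus'])) {
    header('HTTP/1.0 403 Forbidden');
    exit('Sorry, this page is campus-only.');
}
?>

One wrinkle to keep in mind: the saved verdict outlives the request, so a laptop that wanders off campus mid-session keeps its access until the session expires.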
And perhaps there are other strategies, too. Name-based virtual hosting on a firewalled interface, for example.
You can do it!! 🙂