Timestamps are nothing more than numbers: integers that represent a specific point in time. Since integers are unbounded, so in theory is the range of dates that timestamps can represent. The problem here is not with the timestamp system itself, it's with how big a number your OS can deal with. On a 32-bit OS there is a finite-sized integer that your system can handle, and therefore a finite range of dates that can be represented via timestamps. On a 64-bit (or 128-bit, etc.) OS there is a much greater range of dates you can use, but it's still finite.
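To make that size limit concrete, here's a minimal Python sketch (assuming Unix-style timestamps counted in seconds from January 1st, 1970 UTC, and a signed 32-bit integer) showing the last moment a 32-bit timestamp can hold:

```python
from datetime import datetime, timezone

# A signed 32-bit integer tops out at 2**31 - 1.
MAX_32BIT = 2**31 - 1

# Interpreted as seconds since the Unix epoch (1970-01-01 UTC),
# that's the latest moment a 32-bit timestamp can represent.
print(datetime.fromtimestamp(MAX_32BIT, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# A 64-bit timestamp pushes the same limit out by billions of years.
```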
The bigger picture here is that different systems have their own proprietary ways of storing temporal data. Let's say we have two date/times from two different computer systems:
- April 18th, 2005 at 12:12pm EDT
- 2005-04-01 22:14:26
These two dates are just strings and would have to be broken down into pieces to be effectively compared/added/subtracted. That gets very complicated when you're dealing with lots of different (and incompatible) date formats. This is where timestamps come in. Again, timestamps are just numbers, and numbers are very easy for computers (any computer ever made) to compare/add/subtract, which makes them an ideal way to represent dates/times.
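As a rough sketch in Python (assuming the second date was recorded in UTC, and treating EDT as UTC-4), here's what it looks like to turn both strings into timestamps and then compare them as plain numbers:

```python
from datetime import datetime, timezone, timedelta

# Date #1: "April 18th, 2005 at 12:12pm EDT"
# strptime can't read "18th" or "EDT" directly, so we strip the
# ordinal suffix and attach the offset ourselves (EDT = UTC-4).
edt = timezone(timedelta(hours=-4))
dt1 = datetime.strptime("April 18, 2005 at 12:12PM", "%B %d, %Y at %I:%M%p")
dt1 = dt1.replace(tzinfo=edt)

# Date #2: "2005-04-01 22:14:26" -- assumed to be UTC.
dt2 = datetime.strptime("2005-04-01 22:14:26", "%Y-%m-%d %H:%M:%S")
dt2 = dt2.replace(tzinfo=timezone.utc)

# Once both are seconds since the epoch, comparison and
# subtraction are just ordinary arithmetic.
ts1, ts2 = dt1.timestamp(), dt2.timestamp()
print(ts1 > ts2)   # True: date #1 is later
print(ts1 - ts2)   # difference in seconds
```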
Make sense?