Howza? Do you mean there might be some point in the distant future when everyone will say, "No, tell you what, we won't count seconds from 1970-01-01 00:00:00 GMT to specify date and time"?
What timescale are we talking here? Five years? Ten? Thirty? Five million?
The definition as it stands is good until 2038, and the only reason it would fail then is that the timestamp would no longer fit into a signed 32-bit integer - but by then 32-bit processors will be as obsolete as 4-bit processors are now. A 64-bit timestamp would outlast the Solar System, and the galaxy would probably have evaporated as well (unless the Universe collapses first).
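To put numbers on that, here is a quick sketch (in Python, just for illustration) of when a signed 32-bit timestamp runs out, and roughly how far a signed 64-bit one reaches:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # largest value a signed 32-bit time_t can hold

# The last second representable with a signed 32-bit Unix timestamp
overflow = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(overflow)  # 2038-01-19 03:14:07+00:00

# A signed 64-bit timestamp, by contrast, covers on the order of
# hundreds of billions of years - far longer than the Sun will last
years_64bit = (2**63 - 1) / (60 * 60 * 24 * 365.25)
print(f"about {years_64bit:.2e} years")  # about 2.92e+11 years
```

The 2038-01-19 figure is exactly the moment the "Year 2038 problem" refers to; the 64-bit range of ~292 billion years is why nobody expects to revisit the epoch once 64-bit `time_t` is universal.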
Considering that computers still run on much the same underlying infrastructure they ran on forty or fifty years ago, a change as massive as adopting a new epoch seems just as unlikely over a similar timeframe in the future.