Yeah. Clocks/calendars are one of those things that are deceptive. On the surface they seem simple, because we (as humans) deal with them in units that are coarse enough (sizable fractions of a second) and over spans that are short enough (a few years at a time) that most of the idiosyncrasies can be glossed over. For example, unless you hear about it in the news, you'll never notice the leap seconds that get added periodically.
On the other hand, computers work in units of time small enough that errors can accumulate rather rapidly, *and* have to handle spans of time long enough that you're talking *centuries*, minimum.
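Just to make the first half of that concrete, here's a minimal sketch (nothing about it is OS-specific) of what happens if you track elapsed time by repeatedly adding a 100 ms tick as double-precision seconds:

```c
#include <stdio.h>

int main(void)
{
    /* Accumulate one day (86400 s) in 100 ms ticks using double-precision
     * seconds. 0.1 has no exact binary representation, and each addition
     * rounds, so the total drifts away from the true value. */
    double t = 0.0;
    for (long i = 0; i < 864000; i++)
        t += 0.1;

    /* On typical IEEE-754 doubles this comes out noticeably off from
     * 86400.0 (on the order of microseconds) after a single simulated day. */
    printf("expected 86400.000000000, got %.9f\n", t);
    return 0;
}
```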
For example:
Everybody knows that there's a leap year every 4 years.
*Some* people know that there is no leap year every 100 years.
*Few* people know that there is still a leap year every 400 years.
Virtually nobody can tell you offhand whether there are further exceptions beyond that (there aren't; the 400-year rule is where the Gregorian chain ends, which is why 2000 was a leap year), and that's just one example of the calendar-level rules you have to take into account when dealing with time.
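Those three rules are short enough to put in code, and doing so makes the precedence obvious. A minimal sketch (the function name is just for illustration):

```c
#include <stdbool.h>
#include <stdio.h>

/* Gregorian leap-year test, most specific rule first:
 * divisible by 400 -> leap; else divisible by 100 -> not leap;
 * else divisible by 4 -> leap; else not. */
static bool is_leap_year(int year)
{
    if (year % 400 == 0) return true;
    if (year % 100 == 0) return false;
    return year % 4 == 0;
}

int main(void)
{
    printf("1900 -> %d\n", is_leap_year(1900)); /* 0: the every-100-years exception */
    printf("2000 -> %d\n", is_leap_year(2000)); /* 1: the every-400-years exception to the exception */
    printf("2024 -> %d\n", is_leap_year(2024)); /* 1: the plain every-4-years case */
    return 0;
}
```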
That's actually only an issue on 32-bit Unix flavors (or possibly some weird 64-bit flavors that, for some reason, kept time_t defined as a 32-bit signed integer).
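For anyone curious where that 32-bit limit actually lands, here's a quick sketch; it uses an explicit int32_t as a stand-in for the old 32-bit signed time_t, since most current systems define time_t as 64-bit:

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

int main(void)
{
    /* A traditional 32-bit signed time_t counts seconds since
     * 1970-01-01 00:00:00 UTC, so it maxes out at 2^31 - 1 seconds. */
    int32_t t32 = INT32_MAX;     /* stand-in for the old 32-bit time_t */
    time_t wide = (time_t)t32;   /* widen so gmtime() can format it */

    char buf[64];
    struct tm *tm = gmtime(&wide);
    if (tm != NULL && strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", tm) > 0)
        printf("last representable second: %s\n", buf);

    /* That prints 2038-01-19 03:14:07 UTC. One second later, a 32-bit
     * signed counter wraps around and lands back in December 1901. */
    return 0;
}
```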