How so?
Here is an example for signed integers. These two pairs both represent zero time but have different representations in memory:

Seconds: 2, Nanoseconds: -2,000,000,000 (fits in a signed 32-bit integer) → zero seconds
Seconds: -2, Nanoseconds: 2,000,000,000 → zero seconds
Here is an example for unsigned integers. These two pairs both represent one second:

Seconds: 1, Nanoseconds: 0 → 1 second
Seconds: 0, Nanoseconds: 1,000,000,000 → 1 second