Comment on xkcd #2867: DateTime
phoneymouse@lemmy.world 11 months ago
64 bits is already enough not to overflow for 292 billion years. That’s longer than the anticipated age of the universe.
nybble41@programming.dev 11 months ago
If you want one-second resolution, sure. If you want nanoseconds, a 64-bit signed integer only gets you 292 years. With 128-bit integers you can get a range of over 5 billion years at zeptosecond (10^-21 second) resolution, which should be good enough for anyone. Because who doesn’t need to precisely distinguish times one zeptosecond apart five billion years from now‽
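A quick sanity check of those ranges (a sketch using Python's arbitrary-precision integers; the year length is the usual Julian-year approximation):

```python
# Back-of-the-envelope check of signed-integer timestamp ranges.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, ~3.156e7 s

# Signed 64-bit counter at nanosecond (1e-9 s) resolution:
ns_range_years = (2**63) / 1e9 / SECONDS_PER_YEAR
print(f"64-bit ns counter: about +/-{ns_range_years:.0f} years")   # ~292

# Signed 128-bit counter at zeptosecond (1e-21 s) resolution:
zs_range_years = (2**127) / 1e21 / SECONDS_PER_YEAR
print(f"128-bit zs counter: about +/-{zs_range_years:.2e} years")  # ~5.4e9
```

So the 64-bit nanosecond range is roughly ±292 years, and the 128-bit zeptosecond range comes out to about ±5.4 billion years.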
Hamartiogonic@sopuli.xyz 11 months ago
If you run a realistic physical simulation of a star, and you include every subatomic particle in it, you’re going to have to use very small time increments. Computers can’t handle anywhere near that many particles yet, but mark my words, physicists of the future are going to want to run this simulation as soon as we have the computers to do it. Also, the simulation should predict events billions of years ahead, so you may need to build a new time tracking system to handle that.
nybble41@programming.dev 11 months ago
Good point. You’d need at least 215 bits to represent all measurably distinct times (in multiples of the Planck time, approximately 10^-43 seconds) out to the projected heat death of the universe at 100 trillion (10^14) years. That should be sufficient for even the most detailed and lengthy simulation.
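The bit count above can be reproduced directly (a sketch using the comment's own rounded figure of 10^-43 s for the Planck time, rather than the more precise ~5.39e-44 s):

```python
import math

PLANCK_TIME = 1e-43                        # seconds (rounded, as in the comment)
HEAT_DEATH = 1e14 * 365.25 * 24 * 3600     # 100 trillion years, in seconds

# Number of distinct Planck-time ticks, and the bits needed to index them all
ticks = HEAT_DEATH / PLANCK_TIME
bits = math.ceil(math.log2(ticks))
print(bits)  # 215
```

With the more precise Planck time the count comes out one bit higher, but either way it fits comfortably in a 256-bit word.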