When I first saw the news I was thinking “there’s no way atoms vibrate differently on the moon,” but you’re right, it’s about perspective, and I’ve realized there’s no way I’m smart enough to handle timezones on an interplanetary scale. I can only hope that the difference between earth seconds and moon seconds can be expressed as a consistent ratio.
I will gladly use some programming library invented in the basement of a university powered by coffee, and rage.
It’s well-understood math, but it’s “only” relativistic orbital mechanics.
It boils down to a pretty consistent number, but how you get there depends on the mass of the moon, how far it is from earth, and how fast it’s going.
Since the moon moves at different speeds at different points in its orbit, the number actually changes slightly over the month.
They’re just using the average though, since it makes life far easier. We use an average for earth too, since clocks tick at different rates at different altitudes and latitudes.
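If you want to see where that consistent number comes from, here’s a back-of-the-envelope sketch in Python. The constants (earth’s geoid potential, the moon’s GM and radius, the mean earth–moon distance and orbital speed) are rounded textbook values I’m plugging in myself, not anything official, but it lands right around the published figure:

```python
# Rough estimate of the average rate difference between a clock on the lunar
# surface and one on Earth's geoid. Rounded textbook constants, not a real model.
C = 299_792_458.0          # speed of light, m/s
GM_EARTH = 3.986e14        # Earth's gravitational parameter, m^3/s^2
GM_MOON = 4.905e12         # Moon's gravitational parameter, m^3/s^2
R_MOON = 1.737e6           # Moon's mean radius, m
D_EARTH_MOON = 3.844e8     # mean Earth-Moon distance, m
V_MOON = 1.022e3           # Moon's mean orbital speed, m/s
W0_GEOID = 6.2637e7        # Earth's geoid potential (gravity + rotation), m^2/s^2

# Fractional amount each clock runs slow relative to an idealized distant
# observer: gravitational potential terms plus the kinetic (velocity) term.
earth_slowdown = W0_GEOID / C**2
moon_slowdown = (GM_MOON / R_MOON + GM_EARTH / D_EARTH_MOON + 0.5 * V_MOON**2) / C**2

rate_difference = earth_slowdown - moon_slowdown   # ~6.5e-10: lunar clocks run fast
print(f"fractional rate difference: {rate_difference:.2e}")
print(f"drift per day: {rate_difference * 86400 * 1e6:.1f} microseconds")
```

Swap in the moon’s actual distance and speed at a given point in its orbit and you can watch the number wobble over the month, which is exactly why they quote an average.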
It’s not too bad. Relativity says that no frame of reference is special.
-
On earth, a second looks like a second, but a second on the moon looks too quick.
-
On the moon, the second looks like a second, but a second on earth looks too slow.
Both are actually correct. The simplest solution is to declare one of them the base reference, in this case the earth second. Any lunar colonies will just have to accept that their second is slightly longer than they think it should be.
If it helps, the difference is tiny: that second is only about 6.5x10^-10 seconds longer, which works out to 56 microseconds per 24 hours. It won’t affect much for a long time. About the only thing affected would be a lunar GPS.
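To put the “won’t affect much” part in perspective, here’s a quick sketch (assuming the ~6.5x10^-10 rate from above) of how the offset piles up, and what an uncorrected clock error would mean for anything doing radio ranging, which is why navigation is the one thing that cares:

```python
# How fast the offset accumulates, assuming the ~6.5e-10 rate difference above.
RATE = 6.5e-10             # fractional rate difference (assumed from above)
C = 299_792_458.0          # speed of light, m/s

for label, seconds in [("1 day", 86_400), ("1 year", 31_557_600), ("10 years", 315_576_000)]:
    drift = RATE * seconds              # accumulated clock offset, seconds
    ranging_error = drift * C / 1000    # equivalent radio-ranging error, km
    print(f"{label:>8}: {drift * 1e6:10.1f} microseconds (~{ranging_error:,.0f} km of ranging error)")
```

A clock error of one microsecond is roughly 300 meters of ranging error, so a lunar GPS would need the correction almost immediately, while pretty much everything else can ignore it for years.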
Unfortunately, it’s not a useful one. While we know approximately where it is, we don’t know how deep the gravity well is. That gravity well slows the passage of time, just like the earth’s does. Without an exact mass and mass density, we can’t calculate the correction factor.
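The correction factor here is basically the weak-field slowdown GM/(r*c^2), which is why the mass is the missing piece. A tiny sketch with purely made-up numbers shows the sensitivity:

```python
# Why the mass matters: the gravitational slowdown of a clock at distance r
# from a body of mass M scales with the depth of the potential well.
# Hypothetical numbers only, to show the sensitivity.
G = 6.674e-11              # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0          # speed of light, m/s

def fractional_slowdown(mass_kg: float, r_m: float) -> float:
    """Weak-field estimate of how much slower a clock ticks at distance r."""
    return G * mass_kg / (r_m * C**2)

# Two guesses at the mass, a factor of two apart, give corrections a factor
# of two apart -- so without the real mass there's no usable correction.
for mass in (1.0e22, 2.0e22):
    print(f"M = {mass:.1e} kg -> slowdown {fractional_slowdown(mass, 1.0e6):.2e}")
```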