Milky Way Time

In “Is Time an Illusion?” Craig Callender discusses the difficulty of telling whether two events are simultaneous and thus of establishing a universal, standard measure of time. This argument has always seemed unconvincing to me. We know how fast our galaxy is rotating, we know our sun’s position and velocity, and we know Earth’s position and velocity. It seems to me that we could define a “Milky Way Standard Time,” much as was done when Greenwich Mean Time was agreed on in the late 1800s, which made it easy to decide what time it was in California when something happened at a certain time in Chicago. By the same token, though with more to calculate than a simple difference in longitude, it should be possible to compute the Milky Way Standard Time at which two events occurred and determine whether they were really simultaneous. Does this make the problem go away?
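The difficulty the article points to can be made concrete with a short sketch. Special relativity says that events simultaneous in one frame are generally not simultaneous in another: transforming to any chosen “standard” frame moving at velocity v shifts an event’s time coordinate by the Lorentz formula t′ = γ(t − vx/c²). The code below is a minimal illustration, not anything from the article; the galactic velocity and the one-light-year separation are illustrative numbers, and the frame labeled “standard” is a hypothetical choice.

```python
# Sketch: testing simultaneity in a hypothetical "Milky Way Standard"
# rest frame via the special-relativistic Lorentz transform
# t' = gamma * (t - v*x/c**2), in one spatial dimension.
C = 299_792_458.0  # speed of light, m/s

def to_standard_frame_time(t: float, x: float, v: float) -> float:
    """Time coordinate of event (t, x) in a frame moving at velocity v
    along x relative to the local observer."""
    gamma = 1.0 / (1.0 - (v / C) ** 2) ** 0.5
    return gamma * (t - v * x / C**2)

# Two events that are simultaneous for a local observer (t = 0 for both),
# separated by one light-year along the direction of motion:
LY = 9.4607e15  # metres in one light-year
v = 2.2e5       # ~220 km/s, rough solar orbital speed (illustrative)

t1 = to_standard_frame_time(0.0, 0.0, v)
t2 = to_standard_frame_time(0.0, LY, v)
print(t2 - t1)  # nonzero: ~ -2.3e4 s, i.e. simultaneity depends on the frame
```

The point the sketch makes is that picking a galactic reference frame does give a consistent time coordinate, but it is still just one frame’s verdict: observers in other frames would disagree about which events are simultaneous, and no measurement singles out one frame as physically preferred.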
When discussing past-to-future slicing of spacetime, Callender also writes that “the data you need … are fairly easy to obtain. For instance, you measure the velocities of all particles.” But Heisenberg’s uncertainty principle puts definite limits on how accurately one can simultaneously measure the position and velocity (or momentum) of a particle. This is a very important limitation, and it seems to me that the entire argument falls apart at this point.
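The size of that limitation is easy to estimate from the bound Δx·Δp ≥ ħ/2: fixing a particle’s position forces a minimum spread in its velocity. The sketch below (my own illustration, not from the article) evaluates that floor for an electron localized to roughly atomic scale.

```python
# Sketch: the Heisenberg bound delta_x * delta_p >= hbar / 2 implies a
# minimum velocity uncertainty once a particle's position is pinned down.
HBAR = 1.054_571_817e-34    # reduced Planck constant, J*s
M_ELECTRON = 9.109_383_7e-31  # electron mass, kg

def min_velocity_uncertainty(delta_x: float, mass: float) -> float:
    """Smallest delta_v (m/s) compatible with localizing a particle
    of the given mass to within delta_x (m)."""
    delta_p = HBAR / (2.0 * delta_x)  # minimum momentum uncertainty
    return delta_p / mass

# Localize an electron to one angstrom (roughly the size of an atom):
dv = min_velocity_uncertainty(1e-10, M_ELECTRON)
print(dv)  # ~6e5 m/s -- a far-from-negligible velocity spread
```

So “measure the velocities of all particles” cannot be done with arbitrary precision even in principle, which is the letter’s objection in quantitative form.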