I've spent a decent chunk of my career wrestling with time sync: NTP/PTP, GPS, timezones, all that fun stuff. For real-world network time infrastructure, where do we actually hit diminishing returns with clock precision? Like, at what point does making clocks more precise stop helping in practice?
Asking partly out of curiosity: I've been toying with ideas for a future pet project around portable atomic clocks, just to skip some of the headaches of distributed time sync altogether. Curious how folks who've worked on GPS or timing networks think about this.
For network stuff, high-security and test/measurement systems use the Precision Time Protocol [1], which adds hardware timestamps as packets exit the interface. This can resolve down to a couple of nanoseconds on 10G links [2], and in some setups down to picoseconds. The "grandmaster" clock is typically disciplined by GPS and/or atomic clocks.
For test and measurement, it's used for more mundane synchronization of processes. For high-security networks with short, tightly controlled cable runs, you can detect changes in cable length and the latency added by MITM equipment, and keep all the security gear in your network in sync.
[1] https://en.wikipedia.org/wiki/Precision_Time_Protocol
[2] https://www.arista.com/assets/data/pdf/Whitepapers/Absolute-...
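For anyone curious about the arithmetic behind that: PTP's delay request/response exchange produces four timestamps (master Sync TX = t1, slave Sync RX = t2, slave Delay_Req TX = t3, master Delay_Req RX = t4), and assuming the path is symmetric the offset and mean path delay fall out directly. A minimal sketch (Python, names are mine, not from any particular PTP stack):

    def ptp_offset_and_delay(t1, t2, t3, t4):
        """Estimate slave clock offset and mean path delay from one
        Sync / Delay_Req exchange (timestamps in nanoseconds).
        Assumes forward and reverse path delays are equal."""
        offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
        delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way (mean) path delay
        return offset, delay

    # Toy numbers: slave runs 500 ns ahead, true one-way delay is 800 ns.
    t1 = 1_000_000
    t2 = t1 + 800 + 500        # arrival stamped by the slave clock
    t3 = t2 + 10_000           # slave replies 10 us later
    t4 = t3 + 800 - 500        # arrival stamped by the master clock
    print(ptp_offset_and_delay(t1, t2, t3, t4))  # (500.0, 800.0)

The math is trivial; the whole point of hardware timestamping is keeping OS scheduling and queueing jitter out of t1..t4.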
My understanding is that precise measurement of time underpins most other measurements: space, mass, etc. In the SI, the other base units are defined in terms of fixed constants and, ultimately, the second (the metre via the speed of light, the kilogram via the Planck constant). So increasing time precision increases the potential precision of other measurements.
Including, of course, information, which is often encoded as the presence or absence of some alterable state within a specific window of time.
We invent new uses for things once we have them.
A fun thought experiment would be what the world would look like if all clocks were perfectly in sync. I think I'll spend the rest of the day coming up with imaginary applications.
> were perfectly in sync
They couldn't stay synced. There's a measurable frequency shift from a few cm of height difference after all. Making a pair of clocks that are always perfectly in sync with each other is a major step towards Le Guin's ansible!
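To put a rough number on "a few cm" (back-of-the-envelope, not from the article): the gravitational frequency shift near Earth's surface is about Δf/f ≈ g·Δh/c², so

    g = 9.81        # m/s^2
    c = 2.998e8     # m/s
    dh = 0.02       # 2 cm height difference
    print(g * dh / c**2)   # ~2.2e-18 fractional frequency shift

which is right around the stability the best optical lattice clocks reach, so "perfectly in sync" runs into relativity almost immediately.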
For other readers' info, clock stability is crucial for long-term precision measurements, with a "goodness" measured by a system's Allan variance: https://en.wikipedia.org/wiki/Allan_variance
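If anyone wants to play with it, the simplest (non-overlapping) estimator is just half the mean squared difference between successive frequency averages at a given averaging time. A minimal sketch, assuming you already have fractional-frequency samples taken at a fixed interval:

    import numpy as np

    def allan_deviation(y, m):
        """Non-overlapping Allan deviation of fractional frequency samples y
        for averaging factor m (tau = m * tau0)."""
        n = len(y) // m
        ybar = y[:n * m].reshape(n, m).mean(axis=1)   # average into bins of length m
        return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

    rng = np.random.default_rng(0)
    y = rng.normal(0, 1e-11, 100_000)       # white frequency noise
    for m in (1, 10, 100):
        print(m, allan_deviation(y, m))     # falls off roughly as 1/sqrt(tau)

Production tools use the overlapping and modified variants, but the idea is the same.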
I guess very few systems have absolute time better than a few microseconds. Those systems are probably found almost exclusively in HFT and experimental physics.
This past week I tried synchronizing the time of an embedded Linux board to a GPS PPS signal via GPIO. It turns out the kernel interrupt handler already delays the edge by about 20 us compared to busy-polling the state of the pin. Things get hard to measure at sub-microsecond scales.
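The busy-polling side of that comparison looks roughly like this (assuming the pin is exposed through sysfs; the gpio17 path is made up, and Python adds its own jitter, so a real sub-microsecond comparison wants C or the kernel PPS subsystem):

    import os, time

    # Hypothetical sysfs path for the PPS input pin; adjust for your board.
    fd = os.open("/sys/class/gpio/gpio17/value", os.O_RDONLY)

    def read_pin():
        os.lseek(fd, 0, os.SEEK_SET)
        return os.read(fd, 1)

    def wait_for_rising_edge():
        """Busy-poll until a 0->1 transition and timestamp it (CLOCK_MONOTONIC, ns)."""
        while read_pin() == b"1":   # wait out the current pulse, if any
            pass
        while read_pin() == b"0":   # spin until the next rising edge
            pass
        return time.monotonic_ns()

    edges = [wait_for_rising_edge() for _ in range(5)]
    print([b - a for a, b in zip(edges, edges[1:])])  # ideally ~1_000_000_000 ns apart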
GPS-locked 10 MHz reference oscillators are quite common. They're very useful in RF contexts, where they're also easy to find.
From https://news.ycombinator.com/item?id=44054783 :
> "Re: ntpd-rs and higher-resolution network time protocols {WhiteRabbit (CERN), SPTP (Meta)} and NTP NTS : https://news.ycombinator.com/item?id=40785484 :
>> "RFC 8915: Network Time Security for the Network Time Protocol" (2020)
Yes, I'm aware of some of these developments. Impressive stuff, just not the level of precision one achieves tinkering for a few days with a basic GNSS receiver.
Another commenter mentioned that this is needed for consistently ordering events, to which I'd add:
The consistent ordering of events matters when you're working with more than one system. An unsynchronized clock can handle this fine on a single system; it only matters when you're trying to reconcile events with another system.
This is also a scale problem: when you receive one event per second, a granularity of 1 second may very well be sufficient. If you need to deterministically order on the order of 10^9 events per second across systems, you'll want better-than-nanosecond precision if you're relying on timestamps for that ordering.
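One common trick is to make the order deterministic even when timestamps collide, by breaking ties on something like an origin id and a per-origin sequence number. A sketch with made-up field names:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Event:
        ts_ns: int     # timestamp from the producing system's clock
        node_id: str   # which system produced the event
        seq: int       # per-node monotonically increasing counter

    def total_order(events):
        """Deterministic total order: timestamp first, then tie-break on node id
        and sequence number, so every consumer sorts the same way even when
        the timestamps are only approximately right."""
        return sorted(events, key=lambda e: (e.ts_ns, e.node_id, e.seq))

    evts = [Event(100, "b", 1), Event(100, "a", 7), Event(99, "b", 0)]
    print(total_order(evts))

Clock precision then only determines how often the order reflects real time rather than the tie-breaker.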
The Google Spanner paper has interesting stuff along these lines; it relied heavily on atomic clocks.
Since their precision is essential to measuring relativistic effects, I'm not sure we're near that limit.
For your precise question, it may already be there.
For most applications, precise clock synchronization isn't really necessary. Timestamps may be used to order events, but what's important is that there is a deterministic order of events, not that the timestamps reflect the actual order in which the events happened.
In such systems, NTP is inexpensive and sufficient. On networks where ntpd's assumptions hold (symmetric and consistent delays), sync within a millisecond is achievable without much work.
If you need better, PTP can get much better results. A local NTP server disciplined by GPS with a PPS signal can also do slightly better (though without PPS it might well be worse).
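On the "symmetric delays" assumption: NTP's offset estimate uses the same four-timestamp arithmetic as PTP, and any asymmetry in the path shows up directly as an offset error of half the asymmetry. A toy illustration (numbers made up):

    def ntp_offset(t1, t2, t3, t4):
        """Client clock offset estimate: t1/t4 are client send/receive times,
        t2/t3 are server receive/send times."""
        return ((t2 - t1) + (t3 - t4)) / 2

    # True offset is zero, but the outbound path takes 5 ms and the return 1 ms.
    t1 = 0.000
    t2 = t1 + 0.005     # server receives
    t3 = t2 + 0.001     # server replies 1 ms later
    t4 = t3 + 0.001     # client receives
    print(ntp_offset(t1, t2, t3, t4))   # 0.002 s of spurious offset = asymmetry / 2

That's why a nearby stratum-1 server (or a local GPS/PPS reference) helps: shorter, more symmetric paths.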
I know that Google's Spanner[0] uses atomic clocks to help with consistency.
[0] https://en.wikipedia.org/wiki/Spanner_(database)
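The interesting part is the TrueTime API: instead of a single timestamp, the clock returns an interval [earliest, latest] whose width is the current uncertainty, and commits wait that uncertainty out so timestamp order matches real order. A rough sketch of the idea, not Google's actual API:

    import time

    EPSILON_NS = 7_000_000   # assumed worst-case clock uncertainty (single-digit ms in the paper)

    def tt_now():
        """Interval intended (via GPS + atomic clock engineering) to contain true time."""
        t = time.time_ns()
        return (t - EPSILON_NS, t + EPSILON_NS)

    def commit(apply_write):
        earliest, latest = tt_now()
        commit_ts = latest                # a timestamp that is definitely not in the past
        apply_write(commit_ts)
        while tt_now()[0] < commit_ts:    # "commit wait": don't acknowledge until
            time.sleep(0.001)             # commit_ts has passed on every clock
        return commit_ts

The atomic clocks (plus GPS) exist to keep that epsilon small, because every commit pays for it in latency.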
It hit diminishing returns for most things long, long ago, but this physics is directly related to stuff in quantum computing and studying gravity.
They mention a "quantum noise limit"; that must be the ultimate precision that is physically possible, right?
What is this ultimate precision? I imagine that at some point, even the most modest relative motion at ordinary velocities would introduce measurable time dilation at fine enough clock precision.
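Rough numbers for that intuition, using just the first-order kinematic term Δf/f ≈ v²/2c²:

    c = 2.998e8                       # m/s
    for v in (1.4, 30, 250):          # walking pace, highway speed, airliner (m/s)
        print(v, v**2 / (2 * c**2))   # fractional slow-down of the moving clock

Walking pace already gives ~1e-17, above the ~1e-18 level the best optical clocks resolve, so ordinary motion is indeed measurable at that precision.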
When I was in anthropology, many of the cultures I studied had very vague concepts of time (sunrise/sunset, passage of stars and constellations, different seasons). One of my professors spent two weeks on how time was a Western construct and why people go to such great lengths to measure it so precisely.
The very lengthy discussion around the concept was fascinating to me as a 23 year old college student who only knew it from one perspective.
> how time was a Western construct
Japan had a whole fancy temporal hour system before Western contact. It was more complicated than our modern framework, as it was based on the time between sunrise and sunset and so the length of the hours had to be adjusted about every two weeks. But they certainly thought quite a bit about it, so I'm not sure how it could be claimed to not be a concept there at the time.
How would the variable hours be used? Presumably access to timekeeping was limited, so who was even aware of the difference so that they could adjust their lives to accommodate it?
We've been using sundials since antiquity. What was your professor even talking about?
This topic of time being a Western construct, and its impact on society and life, is one of the subjects of the excellent book "Borderliners" [1] by Peter Høeg. A favorite of mine, though I never read the English translation. It's a fascinating topic, and it impacts our lives more than we might be aware of or care to admit. I started thinking about it in a different way after reading this book.
[1] https://en.wikipedia.org/wiki/Borderliners
> time was a Western construct
That really doesn't seem to make sense as written. Even if for "Western" you count all the way to the Middle East (where much of our chronometry originates), there's still a lot found in China and the New World. (From what I can tell, India does not seem to have a strong independent record here? Though they certainly borrowed from the inventors, just like Europe did.)
Is this (the OT [1]) with ytterbium a more or less efficient way to count clock ticks with high precision than is described in [2]?
[1] "Quantum-amplified global-phase spectroscopy on an optical clock transition" (2025) https://www.nature.com/articles/s41586-025-09578-8
[2] "Quantum watch and its intrinsic proof of accuracy" (2022) https://journals.aps.org/prresearch/abstract/10.1103/PhysRev...
"Improve" means nothing unless you give a number.
straight from the abstract:
> we can demonstrate quantum-amplified time-reversal spectroscopy on an optical clock transition that achieves directly measured 2.4(7) dB metrological gain and 4.0(8) dB improvement in laser noise sensitivity beyond the standard quantum limit.
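For a rough sense of what those figures mean in linear terms (assuming the dB values are 10·log10 of a variance ratio, as is usual for metrological gain):

    for db in (2.4, 4.0):
        print(db, 10 ** (db / 10))   # ~1.7x and ~2.5x beyond the standard quantum limit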
That isn't a comparison to the state of the art, just a naive quantum clock.