I’ve spent a decent chunk of my career wrestling with time sync — NTP/PTP, GPS, timezones, all that fun stuff. For real-world network time infrastructure, where do we actually hit diminishing returns with clock precision? Like, at what point does making clocks more precise stop helping in practice?
Asking partly out of curiosity: I've been toying with ideas for a future pet project around portable atomic clocks, just to skip some of the headaches of distributed time sync altogether. Curious how folks who've worked on GPS or timing networks think about this.
For network stuff, high-security and test/measurement networked systems use the Precision Time Protocol [1], which adds hardware timestamps as packets exit the interface. This can resolve down to a couple of nanoseconds for 10G [2], but can get down to picoseconds. The "Grandmaster" clock uses GPS/atomic clocks.
For test and measurement, it's used for the more boring synchronization of processes and whatnot. For high security, with minimal-length/tight cable runs, you can detect changes in cable length and latency added by MITM equipment, and sync all the security stuff in your network.
[1] https://en.wikipedia.org/wiki/Precision_Time_Protocol
[2] https://www.arista.com/assets/data/pdf/Whitepapers/Absolute-...
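For anyone curious about the mechanism: PTP (like NTP) boils down to a four-timestamp exchange, it just takes the timestamps in hardware at the PHY instead of in software. A minimal sketch of the arithmetic (illustrative numbers only, not tied to any particular NIC or library):

    # Two-way time transfer: t1 = request sent (local clock), t2 = request received
    # (remote clock), t3 = reply sent (remote clock), t4 = reply received (local clock).
    def offset_and_delay(t1, t2, t3, t4):
        offset = ((t2 - t1) + (t3 - t4)) / 2.0   # remote clock minus local clock
        delay = (t4 - t1) - (t3 - t2)            # round-trip path delay
        return offset, delay

    # Illustrative values in nanoseconds: remote clock runs 150 ns ahead,
    # one-way path delay is 500 ns in each direction, 2 us turnaround.
    t1 = 1_000_000
    t2 = t1 + 500 + 150
    t3 = t2 + 2_000
    t4 = t1 + 500 + 2_000 + 500
    print(offset_and_delay(t1, t2, t3, t4))      # -> (150.0, 1000)

The symmetry assumption is the whole game: the math attributes half the round trip to each direction, which is also why the cable-tamper trick works. Propagation in copper or fiber is roughly 5 ns per metre, so with nanosecond-resolution timestamps an extra length of cable or an inline MITM box shows up as a step in the measured delay.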
And that precision is really important. For instance, when working with networked audio, where the temporal resolution of packets is usually between 100 µs and 10 ms (so abysmally slow in computer time), non-PTP network cards are basically unusable.
I use fairly precise time but that's because I control high speed machinery remotely. The synchronization is the important part (the actual time doesn't matter). At 1200 inches per minute, being a millisecond off will put a noticeable notch in a piece.
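Quick back-of-the-envelope on that, just arithmetic:

    # How far the tool travels during a given amount of clock desync at 1200 in/min.
    feed_ips = 1200.0 / 60.0                  # 20 inches per second
    for err_s in (1e-3, 1e-4, 1e-6):          # 1 ms, 100 us, 1 us
        err_in = feed_ips * err_s
        print(f"{err_s*1e6:>8.1f} us error -> {err_in:.5f} in ({err_in*25.4:.4f} mm)")
    # A 1 ms error is 0.02 in, about half a millimetre: easily a visible notch.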
PTP and careful hardware configuration keep things synced to within nanoseconds.
My understanding is that precise measurement of time is the basis of all other measurements: space, mass, etc. They are all defined by some unit of time. So increasing time precision increases potential precision in other measurements.
Including, of course, information, which is often defined by the presence or absence of some alterable state within a specific window of time.
We invent new uses for things once we have them.
A fun thought experiment would be what the world would look like if all clocks were perfectly in sync. I think I'll spend the rest of the day coming up with imaginary applications.
This is true, but atomic clocks are about a million times more accurate than any other measurement device. For all practical purposes, they are never the limiting factor.
> were perfectly in sync
They couldn't stay synced. There's a measurable frequency shift from a few cm of height difference after all. Making a pair of clocks that are always perfectly in sync with each other is a major step towards Le Guin's ansible!
For other readers' info, clock stability is crucial for long-term precision measurements, with a "goodness" measured by a system's Allan variance: https://en.wikipedia.org/wiki/Allan_variance
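For a feel of what that measures, here's a minimal non-overlapping Allan deviation over evenly spaced fractional-frequency samples; serious analysis would use the overlapping/multi-tau estimators (e.g. the allantools package):

    import math
    import random

    def allan_deviation(y, m=1):
        # Non-overlapping Allan deviation of fractional-frequency samples y,
        # averaged over blocks of m samples (averaging time tau = m * sample spacing).
        blocks = [sum(y[i:i + m]) / m for i in range(0, len(y) - m + 1, m)]
        diffs = [(b2 - b1) ** 2 for b1, b2 in zip(blocks, blocks[1:])]
        return math.sqrt(sum(diffs) / (2 * len(diffs)))

    # Example: white frequency noise around a steady 1e-9 fractional offset.
    random.seed(0)
    y = [1e-9 + random.gauss(0, 1e-11) for _ in range(10_000)]
    for m in (1, 10, 100):
        print(f"tau = {m:>3} x tau0   ADEV ~ {allan_deviation(y, m):.2e}")
    # For white FM noise the ADEV falls roughly as 1/sqrt(tau); flicker and
    # random-walk noise flatten it out or turn it back up at long tau.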
For most applications, precise clock synchronization isn't really necessary. Timestamps may be used to order events, but what matters is that there is a deterministic order of events, not that the timestamps reflect the actual order in which the events happened.
In such systems, NTP is inexpensive and sufficient. On networks where ntpd's assumptions hold (symmetric and consistent delays), sync to within a millisecond is achievable without much work.
If you need better, PTP can get much better results. A local NTP server following GPS with a PPS signal can get slightly better results (but without PPS it may well be worse).
I'd guess very few systems have absolute time better than a few microseconds. Those systems are probably found almost exclusively in HFT and experimental physics.
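If you just want to see where a box sits, a quick one-off check against a public pool server (this assumes the third-party ntplib package; a running chrony/ntpd will report the same sort of thing via chronyc or ntpq):

    # pip install ntplib  (the standard library has no NTP client)
    import ntplib

    resp = ntplib.NTPClient().request("pool.ntp.org", version=3)
    # offset: estimated local-clock error relative to the server, in seconds
    # delay: the round-trip network delay the estimate had to see through
    print(f"offset {resp.offset * 1e3:+.3f} ms, round-trip {resp.delay * 1e3:.3f} ms")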
This past week I tried synchronizing the time of an embedded Linux board to a GPS PPS signal via GPIO. It turns out the kernel interrupt handler already delays the edge by about 20 µs compared to busy-polling the state of the pin. Things then get hard to measure at sub-microsecond scales.
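For reference, busy-polling the pin from userspace looks roughly like this (sysfs GPIO, with the pin number and path as board-specific assumptions; newer setups would go through libgpiod instead):

    import time

    # Hypothetical pin: assume the PPS line was exported as gpio17 on this board.
    VALUE = "/sys/class/gpio/gpio17/value"

    def wait_for_rising_edge():
        # Burns a whole core while it spins, but catches the edge without going
        # through the interrupt -> scheduler -> userspace path.
        with open(VALUE, "rb", buffering=0) as f:
            prev = f.read(1)
            while True:
                f.seek(0)
                cur = f.read(1)
                if prev == b"0" and cur == b"1":
                    return time.monotonic_ns()
                prev = cur

    print("edge at", wait_for_rising_edge(), "ns (CLOCK_MONOTONIC)")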
If your board's SoC has a general-purpose timer (GPT), you can often have it count cycles of a hardware clock and latch the count on every interrupt pulse from a GPIO. I designed a GPS-disciplined oscillator this way: we had an ADC generate a tuning voltage for a 100 MHz OCXO (which was a reference clock for microwave converters), divided it down to 10 kHz and fed that into the GPT along with the 1 PPS from a GPS module, and the control loop adjusted the voltage until we got exactly 10,000 clock cycles per pulse. This kind of synchronisation gets very accurate over a few minutes.
Even just triggering a GPT capture from a GPS PPS input while it counts cycles of an internal clock, you could use the GPT to work out the error in that clock, and you only need to query it once a second.
Sorry, that should be "had a DAC generate the tuning voltage", not an ADC!
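A minimal sketch of that kind of discipline loop, with the GPT capture and the DAC stubbed out as a toy model (the 12-bit DAC, the gain, and the oscillator response are assumptions for illustration, not the actual hardware):

    # Goal: exactly 10_000 cycles of the divided-down 10 kHz clock between PPS edges.
    TARGET = 10_000
    GAIN = 2                 # DAC counts per cycle of error; depends on the EFC slope
    dac_word = 2048          # start mid-scale on a hypothetical 12-bit DAC

    def discipline_step(counted, dac_word):
        error = counted - TARGET             # positive -> oscillator running fast
        dac_word -= GAIN * error             # nudge the tuning voltage the other way
        return max(0, min(4095, dac_word)), error

    # In the real thing `counted` comes from the GPT capture register on each PPS
    # interrupt and `dac_word` goes to the DAC on the OCXO's EFC pin. Here we fake
    # an oscillator that responds linearly so the loop visibly converges.
    freq_error = 3.0                         # starts 3 cycles/second fast
    for second in range(6):
        counted = TARGET + round(freq_error)
        dac_word, err = discipline_step(counted, dac_word)
        freq_error -= 0.5 * err              # toy response to the tuning change
        print(f"t={second}s  error={err:+d} cycles  dac={dac_word}")

A real loop would use a proper PI controller and average over many seconds, which is where the "very accurate over a few minutes" comes from.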
The Intel Ethernet PPS input pin works much, much better for this. See how the Open Time Card mini folks do it. It's easy to get sub-microsecond even on cheap embedded hardware, and most M.2 cards with Intel chipsets expose the pin as well, for example.
10 MHz reference oscillators that are GPS locked are quite common. They're very useful in RF contexts where they're quite easy to find.
Sure, I was specifically talking about computer system clocks. Also with an oscillator _absolute_ time offset doesn't matter, unless you want to synchronize the phase of distributed oscillators and then things quickly get non-trivial again.
From https://news.ycombinator.com/item?id=44054783 :
> Re: ntpd-rs and higher-resolution network time protocols {WhiteRabbit (CERN), SPTP (Meta)} and NTP NTS: https://news.ycombinator.com/item?id=40785484 :
>> "RFC 8915: Network Time Security for the Network Time Protocol" (2020)
Yes, I'm aware of some of these developments. Impressive stuff, just not the level of precision one achieves tinkering for a few days with a basic GNSS receiver.
Another commenter mentioned that this is needed for consistently ordering events, to which I'd add:
The consistent ordering of events matters when you're working with more than one system. A single system can handle this fine with an unsynchronized clock; it only matters when you're trying to reconcile events with another system.
This is also a scale problem: when you receive one event per second, a granularity of 1 second may very well be sufficient. If you need to deterministically order 10^9 events per second across systems, you'll want better-than-nanosecond precision if you're relying on timestamps for that ordering.
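For what it's worth, the classic way to get a deterministic cross-system order without leaning on wall clocks at all is a logical (Lamport-style) counter; a minimal sketch, not how any particular system does it:

    class LamportClock:
        def __init__(self, node_id):
            self.node_id = node_id
            self.counter = 0

        def local_event(self):
            self.counter += 1
            return (self.counter, self.node_id)   # node_id breaks ties deterministically

        def on_receive(self, remote_counter):
            # Jump past anything the sender had already seen before ticking.
            self.counter = max(self.counter, remote_counter) + 1
            return (self.counter, self.node_id)

    a, b = LamportClock("a"), LamportClock("b")
    e1 = a.local_event()
    e2 = b.on_receive(e1[0])        # b hears about a's event
    e3 = b.local_event()
    print(sorted([e3, e1, e2]))     # every node agrees on the order: e1, e2, e3

The trade-off is that the order is only causally meaningful rather than "what actually happened first" on the wall clock, which is exactly the gap the precise-timestamp approaches (TrueTime-style bounded uncertainty) are meant to close.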
The Google Spanner paper has interesting stuff along these lines; it relies heavily on atomic clocks.
Since their precision is essential to measuring relativistic effects, I'm not sure we're near that limit.
For your specific question, though, it may already be there.
I know that Google's Spanner[0] uses atomic clocks to help with consistency.
[0] https://en.wikipedia.org/wiki/Spanner_(database)
It hit diminishing returns for most things long, long ago, but this physics is directly related to stuff in quantum computing and studying gravity.
> where do we actually hit diminishing returns with clock precision?
Ah yes - that would be the Planck time, which can be derived from Planck's constant, the gravitational constant, and the speed of light.
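The number itself, as a quick sanity check (CODATA values, rounded):

    import math

    hbar = 1.054571817e-34    # J*s, reduced Planck constant
    G = 6.67430e-11           # m^3 kg^-1 s^-2, gravitational constant
    c = 299_792_458           # m/s, speed of light
    t_planck = math.sqrt(hbar * G / c**5)
    print(f"Planck time ~ {t_planck:.3e} s")    # ~5.4e-44 s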
They mention a "quantum noise limit", that must be the ultimate precision that is physically possible, right?
What is this ultimate precision? I imagine that at some point, even the most modest relative motion at ordinary velocities would introduce measurable time dilation at fine enough clock precision.
Yes, there is a limit called "quantum projection noise" that determines how much frequency stability one can achieve with a single-atom clock [1]. With N independent atoms, this limit gets smaller by 1/sqrt(N), but with N entangled atoms one can achieve a 1/N scaling. This is the ultimate limit (Heisenberg limit).
[1] https://journals.aps.org/pra/abstract/10.1103/PhysRevA.47.35...
Not just time dilation: at those scales, time apparently can flow backwards for a bit!
https://arxiv.org/abs/2409.03680
Is this (the OT [1]) with ytterbium a more or less efficient way to count clock ticks with high precision than is described in [2]?
[1] "Quantum-amplified global-phase spectroscopy on an optical clock transition" (2025) https://www.nature.com/articles/s41586-025-09578-8
[2] "Quantum watch and its intrinsic proof of accuracy" (2022) https://journals.aps.org/prresearch/abstract/10.1103/PhysRev...
"Improve" means nothing unless you give a number.
straight from the abstract:
> we can demonstrate quantum-amplified time-reversal spectroscopy on an optical clock transition that achieves directly measured 2.4(7) dB metrological gain and 4.0(8) dB improvement in laser noise sensitivity beyond the standard quantum limit.
That isn't a comparison to the state of the art, just a naive quantum clock.
When I was in anthropology, many of the cultures I studied had very vague concepts of time (sunrise/sunset, the passage of stars and constellations, the different seasons). One of my professors spent two weeks on how time was a Western construct and how people go to such great lengths to measure it so precisely.
The very lengthy discussion around the concept was fascinating to me as a 23 year old college student who only knew it from one perspective.
> how time was a Western construct
Japan had a whole fancy temporal hour system before Western contact. It was more complicated than our modern framework, as it was based on the time between sunrise and sunset and so the length of the hours had to be adjusted about every two weeks. But they certainly thought quite a bit about it, so I'm not sure how it could be claimed to not be a concept there at the time.
How would the variable hours be used? Presumably access to the timekeeping was limited, so who was even aware of the difference, such that they could adjust their lives to accommodate it?
Your professor was just wrong if they claimed time was a Western construct. Calendars, sundials, and other timekeeping devices were created independently around the world.
All you need to create a clock is to realize that your oil lamp consumes its fuel over a somewhat consistent interval, or to make a similar observation about the time it takes drips of water to fill a cup. People figured it out.
I don't know how other cultures viewed time, but I think there is a big difference between being able to make a clock and running your life by a clock. Modern industrialized society is very regimented: work starts at an exact time, ends at an exact time, lunch is exactly an hour, etc. I suspect such notions would be much less useful in a non-industrial society.
The utility of an idea is not the idea itself.
This topic of time being a Western construct, and its impact on society and life, is one of the subjects of the excellent book "Borderliners" [1] by Peter Høeg. A favorite of mine, though I never read the English translation. It's a fascinating topic, and it affects our lives more than we might be aware of or care to admit. I started thinking about it in a different way after reading this book.
[1] https://en.wikipedia.org/wiki/Borderliners
> time was a Western construct
That really doesn't seem to make sense as written. Even if for "Western" you count all the way to the Middle East (where much of our chronometry originates), there's still a lot found in China and the New World. (From what I can tell, India does not seem to have a strong independent record here? Though they certainly borrowed from the inventors, just like Europe did.)
We've been using sundials since antiquity. What was your professor even talking about?
Precise time wasn't even a concept people used daily in the West until trains were chugging their way to various stations. Farmers didn't need to know the exact minute the cows came home.
I don't think it's fair to say that atomic clocks represent a Western cultural value. After all, they are extremely niche. Physicists care, but the average "Westerner" does not.
And paper money is a Chinese invention. Doesn't mean it's worthwhile to spend two weeks in an anthropology class talking about how much awesomer they are.