All clocks are 30 seconds late
352 comments
January 6, 2025
GuB-42
wodenokoto
I think that’s the problem with the article - that it sticks to its guns.
It starts with an outrageous statement, goes on to show that it’s actually correct. Then it relates it to similar things and instead of saying “yeah, just like we floor years and hours it makes sense to do it for minutes too, but it was fun to think about” it goes on to say “but for minutes this is bad”
If it had backtracked and said “flooring is actually the better choice” I would have appreciated the article as a whole much more
fouronnes3
Thanks for the feedback. I agree this is how I should have ended the article. If anything, the most important thing with conventions is that we all follow the same ones. So in the end I'm obviously not gonna move all my clocks forward by 30 seconds. This is just how I decided to write the article, but I concede I should have made the tongue-in-cheek tone more explicit.
mosselman
You can still change the ending. It is your article and right now quite a few people will have read it, but if you keep it live for years many more will probably get to read it and there is no reason why you should keep some first version of it online if you think it should’ve been different.
You wouldn’t leave a v1 of an app online just because it is first
atoav
It seems to me many people are so amazed by the fact that they for once had an original thought that they stop caring whether it is actually a good one.
It is crucial to maintain mental flexibility and one does that by thinking things through, killing your darlings, admitting when ideas are wrong or simply just mediocre. Only because it was me who had an idea doesn't mean I have to defend it at all cost. The idea isn't me.
ForOldHack
I have been staring at this comment for over 2 hours. It is brilliant on so many levels... During that first hour, two people DMed me, and said a couple of my practices are genius.
I think that this is one of the very most important ideas I have ever read on YC:
"It is crucial to maintain mental flexibility and one does that by thinking things through."
circlefavshape
One of the big downsides of the internet is the cold water it constantly pours on my idea of my own originality. Every time I think of a great idea I find someone else has already thought of it
(well, almost every time)
teaearlgraycold
I am sure this is not an original thought.
mongol
> With flooring, if you see 13:00 you know you are late
I always thought that you are late from 13:01. That's common these days with Teams meetings etc. It seems most people join during the minute from 13:00 to 13:01.
viraptor
Because of how lots of reminders work. There isn't even a good way to tell Google calendar to always notify 1 minute before events - I had to do it through a Slack integration.
So instead the reminder usually tells you a meeting will be in 15 min, which quite often is useless information. Then the app tells you the meeting started right now, and you still need a few seconds to wrap things up and prepare.
vel0city
> There isn't even a good way to tell Google calendar to always notify 1 minute before events
It's on the calendar settings. Settings for my calendars > Event Notifications. You can set 5 default notification options for all events created on that calendar.
cbolton
Seems like there are important cultural differences in how appointment times are understood. Last week I was talking to a friend living in the Comoros, who mentioned that for them 13:59 is still 13:00 for this purpose.
lgrapenthin
So they ignore minutes entirely and just live in hours?
falcor84
I have friends who treat dinner party invitation times in this manner
ForOldHack
Most people are not clear on two concepts: Be prepared, and on time is late. Both of these are not math skills, they are leadership skills.
It's rather easy to establish a beat and set two clocks to one reference (seminar program clocks to USNO standard time). After a few hours, even the most inexpensive digital clocks will not vary by a second; it usually takes a full day to drift that far.
It's quite uncommon for everyone to be ready, alert, and available on time, even in integrity conversations.
flerchin
I tell my kids this aphorism:
Early is on-time. On-time is late.
andix
This vastly depends on culture and context.
We consider it impolite to show up for an invitation at someone's home before the time you were invited for. Many would say it is even impolite to show up less than 5 minutes late, consider being 10-15 minutes late best, and up to 30-45 minutes acceptable.
For a business appointment or doctor's appointment, where there is an assistant that opens the door and a waiting area, it's expected to be early, so that you are already in front of the correct room when the appointment starts.
mikenew
Showing up early just makes other people feel like they did something wrong by showing up on time.
pests
Depends on the power dynamic and the goals for the meeting, and what position you hold, no?
mongol
This makes a lot of sense. But where it really matters, say train departure times, are there rules that the doors close precisely at X seconds? Or is it arbitrary?
anal_reactor
Early is on-time. On-time is late. Late is how most people behave.
xoxxala
We always taught our kids that if you’re not five minutes early, you’re late.
One boy took it to heart and is very prompt.
The other, eh, not so much. He was almost late to his own wedding.
Kiro
I'm usually early, but I watch the preview until there are at least two other people in the call, then I join. I suspect many others do the same, which sometimes results in implicit standoffs.
yzydserd
I live by joining before
:58 if presenter
:59 if core
:00 if contributor
:01 if observer
Many colleagues seem to +:01 this.
TZubiri
But if every clock was like that, then 12:59:30 would be the new 13:00:00
The_Colonel
But not every clock would be like that - only those clocks which don't show the seconds precision would use this rounding.
The consequence of that would be that statements like "fireworks start at 12 AM" would mean two different points in time depending on how much precision your clocks have.
fouronnes3
That's true if you catch the exact moment the clock changes. If you don't, the only thing you know with a truncating clock is that the fireworks started 0 to 59 seconds ago. With a rounding clock, you know the starting point is within [-30, 30] from now. So on average, you're closer to the starting point when seeing the clock show 12AM.
A good reason for truncating is that we have a strong bias against being late, but not really against being early.
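The [0, 60) versus [-30, +30] ranges above can be checked with a quick simulation (an illustrative sketch, not from the thread; the helper names are invented):

```python
import random

def truncating_offset(s):
    # A truncating (flooring) clock at s seconds past the minute shows a
    # reading that is s seconds behind the true time: offset in [0, 60).
    return s

def rounding_offset(s):
    # A rounding clock shows the nearest minute boundary; the signed
    # offset from the true time stays within [-30, +30).
    return s if s < 30 else s - 60

random.seed(0)
samples = [random.uniform(0, 60) for _ in range(100_000)]

mean_trunc = sum(truncating_offset(s) for s in samples) / len(samples)
mean_round = sum(rounding_offset(s) for s in samples) / len(samples)
mean_abs_round = sum(abs(rounding_offset(s)) for s in samples) / len(samples)

print(mean_trunc)      # ~30: a truncating clock lags ~30 s on average
print(mean_round)      # ~0: signed offsets of a rounding clock cancel
print(mean_abs_round)  # ~15: typical distance from the true time
```

So on average a rounding display is closer to the true time, matching the argument above, while a truncating display is biased toward being behind.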
soneil
So if my watch shows seconds, I'd be late at 12:59:31?
bmicraft
If someone without seconds on their clock starts the meeting, yes.
dylan604
How does 12:59:30 floor to 13:00:00? Wouldn't that be the result of ceil?
pkulak
Only until the next article saying that "all clocks are 0.5 seconds early" and we then switch to randomized rounding.
zzo38computer
I still think that flooring would be better; however, if you insisted on doing this rounding instead, you could use a different convention for numbering seconds, e.g. -30 to +30 instead of 0 to 60. I think this is not worth it, though, and the existing use of flooring is much better. In any case, if you want such precision in timing, you really should use a clock that displays seconds rather than one that does not.
jsnell
> This is especially apparent when you're trying to calculate "how much time until my next meeting?", and your next meeting is at noon. If it's 11:55, you would usually mentally subtract and conclude: the meeting is in 5 minutes. That's how I always do it myself anyway! But the most probable estimate given the available information is actually 4'30"!
Ok. But what does it mean for a meeting to start at 12:00 when people don't have clocks that will accurately show 12:00? They'll only know when the time is 11:59:30 or 12:00:30, anything between is just going to be a guess. So it seems to me that the start times would just align to the half-minute offsets instead, and we'd be back exactly where we started but with more complexity.
FartyMcFarter
> If it's 11:55, you would usually mentally subtract and conclude: the meeting is in 5 minutes. That's how I always do it myself anyway! But the most probable estimate given the available information is actually 4'30"!
The way I like to think about it is "the meeting is in less than 5 minutes". Which is always correct since my reaction time to seeing the clock switching to 11:55 is greater than zero.
It could even be less than 4 minutes if it has already switched to 11:56 and I haven't had time to react to that change, but that's OK - my assessment that I have less than 5 minutes to get to the meeting is still correct.
xattt
All broadcast studios are equipped with master clocks that show seconds to deal with this ambiguity.
You can look at your own watch and anticipate when program transitions in radio or TV are supposed to take place (usually the minute and 30 second marks). Also, get a sense when a host is filling time to get to the transition.
dylan604
I've done a lot of work with hosts on various shows. One guy stood out more than the others for being so natural on the vamp/stretch to fill the time. Starting at 5 minutes, we give one-minute signals. Not once did it ever sound unnatural from trying to rush, or filled with ums, uhs, or ahs. Others struggled, with the rushing being most noticeable.
mb5
Jonathan Agnew has a similar story about the late, great Australian cricket commentator Richie Benaud, although filling ~52 seconds rather than 5 minutes.
xattt
> vamp
Thanks for teaching me a new term!
sigmar
This is a good point. There are tons of times when I'm watching a clock for a precise moment (like buying concert tickets, limited edition merchandise, stock market opening). Losing the ability to see when 12:00:00 happens would be annoying.
timerol
If you care about starting a meeting to within better precision than a minute, use a clock that shows seconds. If I want to start a meeting at noon, I don't block off the minute display of my clock and wait for the 11 -> 12 transition to start the meeting.
caseyy
> If it's 11:55, you would usually mentally subtract and conclude: the meeting is in 5 minutes. That's how I always do it myself anyway
And if you showed up to the meeting in exactly 5 minutes, you’d be on time!
croes
> If it's 11:55, you would usually mentally subtract and conclude: the meeting is in 5 minutes.
Even that part is wrong. I guess I‘m not the only one who knows and thinks it’s less than 5 minutes.
brnt
The technically correct thing to do would be to educate on precision, perhaps even display it, such that people know that 12:00 means a time between 11:59 and 12:01, not 12:00.000.
timerol
The point is that we use 12:00 to note a time between 12:00:00.0 (inclusive) and 12:01:00.0 (exclusive). Saying that 12:00 is a time between 11:59 and 12:01 implies that the range of error is twice as big as it actually is.
How long between 12:01:00.0 and 12:00 (as read on a clock)? Between 0 and 60 seconds.
How long between 11:59:00.0 and 12:00 (as read on a clock)? Between 60 and 120 seconds.
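The interval arithmetic above can be sketched in a few lines (illustrative Python; the function names are mine, not from the thread):

```python
def remaining_bounds(event_s, display_s):
    # Seconds until event_s when a floored clock reads display_s.
    # The true time lies in [display_s, display_s + 60), so the wait
    # lies in (event_s - display_s - 60, event_s - display_s].
    return event_s - display_s - 60, event_s - display_s

def elapsed_bounds(past_s, display_s):
    # Seconds since past_s when a floored clock reads display_s:
    # elapsed time lies in [display_s - past_s, display_s - past_s + 60).
    return display_s - past_s, display_s - past_s + 60

noon = 12 * 3600
# Clock reads 12:00; 12:01:00.0 is between 0 and 60 seconds away.
print(remaining_bounds(noon + 60, noon))  # (0, 60)
# Clock reads 12:00; 11:59:00.0 was between 60 and 120 seconds ago.
print(elapsed_bounds(noon - 60, noon))    # (60, 120)
```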
brnt
What I am saying is that that use is incorrect as well. There is only one way to understand numbers, and that is scientifically, i.e. with significant digits.
lxgr
Good luck educating people on why they should change lifelong habits that actually even make more sense most of the time too.
bena
Exactly, this article can be summed up on one sentence: "Look at me, I'm so clever"
zemnmez
I apologise for my "but, actually...":
Analogue clocks like the face of Big Ben are not like digital displays, and whether they "show seconds" in the context of this article is not, as with digital displays, down to whether there is a dedicated hand.
Unlike digital displays, the largest denomination hand on an analogue clock display contains all of the information that the smaller hands do (depending on the movement in some cases).
The easiest way to realise this is to imagine a clock without the minute hand. Can you tell when it's half-past the hour? You can. The hour hand is half way between the two hours.
Again, it depends on the movement, but it is not out of the question that your minute hand is moving once every second, and not every minute. It is down to the number of beats per unit time for an analogue display as to what the minimum display resolution is (regardless of if the movement is analogue or digital itself).
danieldk
> Unlike digital displays, the largest denomination hand on an analogue clock display contains all of the information that the smaller hands do (depending on the movement in some cases).
You would be surprised. When I was a kid, I sometimes used to stare at the clocks with an analog face at the train station while waiting for the train to school to arrive.
Interestingly enough, the seconds hand would go slightly faster than actual seconds, and at 60 seconds it would get stuck for a moment, as if it were pushing the minutes hand, and then the minutes hand would flip to the next minute.
Found a video here:
https://www.youtube.com/watch?v=ruGggPYQqHI
The description describes how they work, which seems like a mixture of digital and analog (due to the use of both cogs and relays + propagation of pulses from central to local clocks), translated:
- The seconds hand makes a revolution in 57-58 seconds and is then stuck for 2-3 seconds.
- The seconds hand is driven using 230V.
- The minutes hand gets a 12V or 24V pulse once every 60 seconds. The polarity has to swap every 60 seconds; the swapping can be done using a relay or specially-made components.
- The hours hand is driven by the minutes hand using cogs.
Edit: more information and references here: https://en.wikipedia.org/wiki/Swiss_railway_clock#Technology
Doxin
The key to this mechanism is that the stepping of the minute hand is what unlocks the second hand. Pretty clever low-tech way to keep a LOT of clocks in really close sync.
Dutch train stations used to have these too, I loved to watch them in action while waiting for a train.
devnullbrain
On a wristwatch it's also easy - and probable - to set a minutes hand out-of-sync with the seconds, so they don't both line up at 12 at the hour.
eviks
Thanks for the video, what a silly design, especially given the Swiss reputation when it comes to clocks...
rob74
If you think of the design goals (synchronizing clocks across the train network) and the technology available at the time, the design is actually pretty clever. Knowing the exact second is not important - if the second hand actually completes a whole cycle in only 58 seconds, this is still good enough to be able to see how much of the minute has passed. Having the exact same minute on all clocks is much more important than that - especially since train departure times are usually "on the minute".
decentralised
>The easiest way to realise this is to imagine a clock without the minute hand.
No need to imagine it; it was invented many years ago and it's called a perigraph. Meistersinger makes one of the nicest I've seen: https://www.relogios.pt/meistersinger-perigraph-relogio-auto...
jolmg
> it depends on the movement, but it is not out of the question that your minute hand is moving once every second, and not every minute.
I think the only place where I've seen the minute hand move by the minute has been on TV, in those climactic moments where the camera zooms in on the clock and strikes a certain time. Maybe it's a trope, for emotional tension, like mines that don't explode until you step off.
kuschku
> The easiest way to realise this is to imagine a clock without the minute hand. Can you tell when it's half-past the hour? You can. The hour hand is half way between the two hours.
Can I? Many analog clocks actually "tick" the second and minute hand. I've even seen some that tick the hour hand.
deaddodo
You literally just defined the difference between digital (binary) and analog (gradation).
A digital clock is 1:01 or 1:02. An analog clock is some tick of some range (depending on the resolution, as you abstracted), at all times.
lelandbatey
I think a slightly better term is "discrete" vs "continuous". Some analog clocks are discrete, some are continuous. Some digital clocks operate on a resolution so fine that they appear to move continuously. It's quite lovely to find those that invert your expectations when out in the real world.
hunter2_
It's a bit more than that:
There are analog clocks where all hands move continuously (like when there's a second hand with no discernible beats). There are analog clocks where all hands move discretely once per second (60 BPM for all hands). There are analog clocks where the minute hand moves at 1 BPM (quantized to the floor of each minute) while the second hand does something else (perhaps discrete movement at 60 BPM, or perhaps continuous other than a pause at the top of each minute, etc.). And there are digital clocks!
deaddodo
You're correct, thanks for the clarification. I was going more with the colloquial understanding of the two (analog = continuous; digital = discrete) and was trying to touch on the vagueness of no true analog clock with the reference to ticks/resolution.
However, your explanation is definitely much better.
eviks
All universal statements like this are wrong and stem from basic ignorance
> So when it's actually 14:15:45, they'll show 14:15. And when the actual time goes from 14:15:59 to 14:16:00, then that's when your clock changes from 14:15 to 14:16.
No, that's a silly mistake. Look at the picture (though a video would be much better) of the analogue clock to see it's not the case: the minute hand moves continuously, so it isn't exactly at :15 at 14:15:59.
> the meeting is in 5 minutes.
That's not the only question we ask about time. Has the meeting/game already started? You can't answer that with an average value
> for some context appropriate reason) you reply with just hours, you would say it's 11!
No, your reply would depend on the context, which you haven't specified.
> Please someone tell me I'm not crazy
Of course not, just trying to replace one ambiguity with another. Maybe instead come up with a more precise minutes display in digital clocks that adds more info like two dots flashing when it's past 30 sec and only 1 dot when it's before? (In phones you could use color or a few more pixels within all the padding?)
verzali
> All universal statements like this are wrong and stem from basic ignorance
animuchan
Yup, I think the "has the {thing} already started" is, for many people, the most useful function of precise time anyway. All sorts of work and personal meetings, transportation schedules, doctor's appointments, and so on.
Knowing the ballpark in the form of "it's 15:30-ish", even if more precise, is strictly less useful than "you're late to the 15:30 meeting with your manager".
Fun article nonetheless, and interesting perspectives on both sides!
hartator
Interesting take!
Both of your pic examples are wrong though. That digital clock does show seconds, and the London clock has its minute hand in between minute marks - showing progress between minute marks if you look closely. This is the same for all analogue clocks.
crgk
I’m ready for the rebuttal post: “different clocks have different approaches to conveying information about seconds within a minute” which uses the same photos as examples.
fouronnes3
No, my digital clock doesn't show seconds. As for the London one, I was actually wondering about that! I know it depends, because some analog clocks work like digital ones and snap to the next minute in discrete increments.
necovek
Most analog (really, with a geared mechanism) clocks do not "snap" on exact minutes but slowly drive toward them (because that's simpler and thus cheaper).
undersuit
Analog clock movements with second hands! The second hand is rarely smooth (we want the tick), but the minute and hour hands are smooth.
TeMPOraL
Except those of us (like my SO) who are bothered by the ticking sound at the edge of audibility, and prefer smooth seconds motion.
timw4mail
Unless it's an old AC-motor clock
kevin_thibedeau
All escapement driven clocks are discrete.
donkers
This is true unless you look at something like a Seiko Spring Drive, which has a completely smooth second hand sweep, although it’s not entirely mechanical (and maybe only watch nerds care about this)
https://www.hodinkee.com/articles/does-spring-drive-have-an-...
michaelcampbell
Pedantically, only above second granularities. They're continuous between second hand sweep movements at subsecond ones, no? I mean, there's no point on the watch that the second hand doesn't "hit" at some point, however small.
Or am I wrong that "intermittent, jerky, continuity is still continuity"?
tavavex
Not all analogue clocks smoothly move the minute hand to show progress in the current minute. Many of them tick over, truncating the information to the minute like what digital clocks do.
BenjiWiebe
I have never seen a round clock with hands that tick over a minute! And I look at clocks. Most that have second hands tick over seconds, though.
Where do you live?
aidenn0
In movies when the villain has placed the hero in the mechanism of a clock-tower, the minute hand seems to always tick over a minute. I don't recall ever seeing it in real-life, but I don't look at clocks in clock-towers that often.
JonathonW
I have a round analog clock with a particularly strange arrangement: it has a second hand (that ticks every second), and it has a minute hand that only moves every fifteen seconds.
(It's a radio-controlled clock: it has the second and minute hand on separate motors presumably because syncing to the actual time if there were only a motor for the second hand like a conventional analog clock would take too long (and probably make determining position more complicated). There is no independent motor for the hour hand, so it does have to roll the minute hand around to move that one.)
Kirby64
There’s some very neat designs that only tick the minute hand once per minute, as it’s significantly more power efficient to do so. You just power the hand once per minute, as opposed to continuously driving the hand in small increments.
perryizgr8
I see these clocks often in railway stations (I live in India). There is no seconds hand. The minute and hour hands move in clicks, not smoothly like most clocks.
michaelcampbell
Maybe; I think of analog clocks as ones with an analog, continuous mechanism. As such they happen to use a sweep hand display.
Quartz and the like CAN also use non-digital displays, but I wouldn't consider them analog timekeepers.
lupire
Quartz is an analog mechanism like a pendulum. It doesn't stop at each cycle.
roywiggins
Vaguely related: I don't think people are being taught how to read analog clock faces nearly as much anymore, and apparently phrases like "quarter past ten" are becoming, so to speak, anachronisms.
Suppafly
The only one that I know of as being an anachronism is saying "quarter of" or similar. At one point people decided that 'of' meant 'to' and after a while we forgot that because it was stupid. People still say quarter past though.
aidenn0
My teenage daughter needs me to explain it to her every time I say either "quarter past" or "quarter to"
doubled112
Also vaguely related, I've come to realize some people find metric measurements easier than feet and inches.
I find the fractions simpler. Need a half of that half? Just double the denominator.
My wife would seemingly rather keep counting .1 centimeters.
The same applies with clocks. It's easier for me to rough out how long I have if I just chop the face into fractions vs mental arithmetic, as brutal as that sounds. What do you mean this guy can't divide 30 in half?
dredmorbius
If only we were a species with six fingers per hand.
BlueTemplar
Romans were using both: "metric/decimal" for numbers, but "imperial/dozenal" for fractions:
https://en.m.wikipedia.org/wiki/Roman_abacus
(Base 12x5=60 was of course used by their Babylonian predecessors.)
Gormo
But 15 minutes is a quarter of an hour regardless of whether you are using an analog or a digital clock to read the time.
dredmorbius
The spatial representation on an analogue clockface is far more evident. Each 15 minute interval sweeps out a quarter of the face with the minute hand.
marpstar
a "quarter" means 1/4th -- it's a "quarter" turn of rotation on a physical clock, but 60/4 is always 15.
or were you making the distinction between "quarter past" and "quarter after", because I'd agree that the former is a lot less common.
roywiggins
I'm not sure, but either way, "it's a quarter past 6" has gotten me blank stares.
lupire
Technology Connections did it:
https://m.youtube.com/watch?v=NeopkvAP-ag
Bonus on analog vs digital mechanism in flip clocks:
dmd
Public schools here in suburban Boston MA still teach analog first.
aidenn0
So do the public schools here, and we have 3 analog clocks in my house, but 3/4 of my children cannot read an analog clock, and 2/4 of them do not understand me when I say "quarter past" or "quarter to" no matter how many times I explain it.
ta1243
Don't kids know how to tell the time before they go to school?
dboreham
This is regional. US never used quarters afaik.
dredmorbius
No, it's absolutely widespread that they do.
You can confirm this by searching "quarter past" at any significant US website, e.g., the NY Times:
<https://duckduckgo.com/?q=site%3Anytimes.com+%22quarter+past>
"Quarter of" or "quarter to" are less frequent, but can be found and heard.
Gormo
What do you mean? People in the US routinely use "quarter after" and "quarter to" when telling time.
JohnBooty
I've only ever lived in NE USA, but I have traveled, and I definitely don't think it's regional.
Generational though, sure.
scrozier
Oh yes, I grew up saying "quarter past four." Probably don't anymore, but it was definitely in the vernacular in the US in years past.
throwaway519
I feel you're missing the elephant in the room with the clock observation:
Time is a cube, not a cuboid.
PeterCorless
Yes. Though analog second hands often "tick" the seconds. (Some move the second hand smoothly.)
RIMR
>That digital clock does show seconds
It most certainly does not.
I see HH:MM, temperature in Celsius, humidity in percent, alarm status, alarm time, day of the week, and DD/MM. None of those are seconds. It is a truncating digital clock that rounds down.
>the London clock has its minute hand in between minute mark - showing progres between minute mark if you look closely.
"If you look closely" isn't really how analog clocks work in practice. Without a second hand, the limits of human vision prevent us from fully calculating the time between minutes, as each second only represents a 0.1° change in angle of the minute hand, and most mechanical analog clocks aren't designed for the minute hand to move perfectly linearly between minutes.
keskival
It's not really about flooring or rounding, but whether one thinks of time indices as ranges or moments.
Days, as the author points out, are thought of with "flooring", but more accurately a date is thought of as a range covering the times belonging to that date.
Minutes one can consider as ranges or as time indices. The error comes in switching the interpretation from the start of a range to an actual estimate of a point-like time index.
ASalazarMX
A minute is an insignificant period for most daily tasks, so the convention "show me when the minute changes" is simple and pragmatic. If your task needs precise count of seconds, you get a clock that shows when the second changes, and now you are half a second late on average.
You can keep playing with increasingly smaller time units until you conclude, like Zeno's arrow paradox, that you're always infinitely late.
msm_
Pointless remark about myself, but I always set my phone's clock to second precision (I think this setting is hidden somewhere, or even needs a third-party app to unlock), and I am annoyed there's no way to do this on the lockscreen. How is it possible that nobody else (apparently) wants it, and it's not the Android default? Why would I want a clock that is, on average, a half minute off?
ploynog
> Why would I want a clock that is, on average, a half minute off?
Because in 99.9% of the cases I don't care about the seconds, it takes away space in the top status bar, and the constant changing of seconds in the top-left corner of the screen is distracting. And for the remaining 0.1% of cases, there is the clock app that shows seconds.
What benefit do you gain in daily life by having the time down to the second? The argument "so it's not half a minute off on average" seems a bit self-referential.
iggldiggl
> and I am annoyed there's no way to do this on the lockscreen
With some OEMs there is (personally I know that current-ish Sony phones offer a corresponding option), but yeah, it is a bit annoying that that isn't universal… part of the reason I still carry a regular watch.
derbOac
I think that's about right.
Another way of thinking about this is that the author is confusing time as measurement (how much time) with time as rule (what time is it). If you wanted to measure the duration as a difference in clock times, yes, there would be a certain amount of measurement error incurred by the way clocks are displayed. But if you want to know the time, in the sense of whether a certain time has been reached, or a certain graduation has been crossed, it doesn't make sense to round to the nearest minute.
The question of "how much is this clock off?" is only meaningful with reference to a certain use or interpretation of the numbers being displayed. If you say it's "8:56" people know it could be anything up to but not including 8:57, but greater than or equal to 8:56. The number means a threshold in time, not a quantity.
Ukv
I don't think this applies to Elizabeth Tower/Big Ben, as it's an analogue clock and, from footage I can find[0], its minute hand appears to move continuously opposed to in steps. (or at least, not in full-minute steps)
Also, I believe it's wrong to say "the average error would be 0" if rounding to nearest minute. The average offset would be 0, but the average error would be 15, to my understanding.
fouronnes3
That's a good point, I was actually wondering about that. I've seen a lot of jumping analog clocks so I incorrectly assumed Big Ben was the same. I should have checked :)
tim333
The jumping ones are mostly electrically activated. Big Ben from 1854 is gloriously mechanical - some of the mechanism: https://www.youtube.com/watch?v=G-WSzdAF8b8 pendulum https://youtu.be/U8gkgWoFBAw guy who winds it https://youtu.be/dMT-OOrLBik
jtbayly
I can get behind this idea, however, this sentence is wrong:
"If clocks rounded to the nearest minute instead of truncating, the average error would be 0.”
The negative and the positive error don’t cancel each other out. They are both error. The absolute value needs to be used.
lavelganzu
Good catch. RMS (root mean square) error is typical in signal processing to avoid this undesirable cancellation.
Straw
The average error is in fact 0! The average absolute error is reduced but not 0.
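The three metrics being conflated in this subthread (mean signed error, mean absolute error, RMS error) can be separated with a small sketch (illustrative Python, not from any comment):

```python
import math
import random

random.seed(1)
# True position within the minute, uniform over [0, 60) seconds.
xs = [random.uniform(0, 60) for _ in range(100_000)]

# Signed error of a rounding clock: displayed minute boundary minus true time.
err = [(-x if x < 30 else 60 - x) for x in xs]

mean = sum(err) / len(err)                           # ~0: signed errors cancel
mae = sum(abs(e) for e in err) / len(err)            # ~15 s: typical magnitude
rms = math.sqrt(sum(e * e for e in err) / len(err))  # ~17.3 s (= 60 / sqrt(12))

print(mean, mae, rms)
```

For a flooring clock the same signed error is uniform on (-60, 0], so its mean is about -30 s and its mean absolute error about 30 s: rounding halves the typical error but only the signed average reaches zero.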
jy14898
By this logic, a broken 24-hour clock stuck at midday has 0 error.
jtbayly
That may technically be correct, but it is incorrect in the real world. I submit that error is error in the real world. Mathematics can go jump off a cliff unless it wants to be helpful. :)
roywiggins
Zero average error conveys something important though: the error that there is, isn't biased positive or negative.
pkilgore
That's language failing us, not maths :-)
noqc
What are you talking about? Error is a metric.
fouronnes3
That's a very fair nitpick, but even with a more rigorous error function the point still stands, I think.
jtbayly
Agreed. There will be less error, just not zero. I thought it was a silly error that detracted from the point, rather than defeated the point.
lupire
It depends on the application. Are you summing times (as with a pay clock at a job), or are you paying for error in both directions for some reason?
mglz
Huh?! A clock shows the period of time we are currently in. A clock only showing hours would, for example, indicate that we are in the 14th hour of the day, for the entire duration of that hour. That is not an error. Similarly, a hh:mm clock will show the hour and minute we are currently in for the duration of that minute.
No clock can display the exact current moment in time. That would require infinite digits, and even then those would be late, since the speed of light ensures you receive the femtoseconds and below really late.
v4vvdq
An analog clock does show the exact current moment of time (if the hands move in a linear motion and don't jump).
cess11
What time it is, is just made up, it's something we can decide freely through social power, as evidenced by timezones and daylight savings and leap seconds.
Commonly the resolution is something like minutes or a few of them, that's the margin we'll typically accept for starting meetings or precision in public transport like buses.
The utility of femtoseconds in deciding what time it is seems pretty slim.
kypro
Yeah, I think labelling it "error" is a bit of a strange way to look at it to be honest.
It's only error if you're trying to measure time in seconds but are doing it with a clock that can only measure hours and minutes. If you want to know the current minute, then a clock that measures minutes is 100% correct.
It's an interesting thought experiment, but really all it's saying is that half of the time 10:00 is closer to 10:01:00 than to 10:00:00. But this implies you care about measuring time to the second, which prompts the question of why it's being measured in minutes.
To be charitable, I suppose in the real world we might occasionally care about how close 10:00 is to 10:01 in seconds, but the time shown on phones can't tell us that so on average we will be about 30 seconds out.
epcoa
The fallacy is the leap in logic from "the average error is x" to "the clock is x late". Seeing the exact transition is often as useful as, if not more useful than, minimizing the average displayed error.
macleginn
This is nitpicking, but the step from "the average error of a truncating clock is 30 seconds" to "therefore all clocks are 30 seconds late!" is seriously wrong. For one, the median error equals the mean here, so about half the time a clock is less than 30 seconds late, which directly contradicts the headline.
lxgr
> Basically I'm arguing that rounding for clocks would be more useful than flooring. This is especially apparent when you're trying to calculate "how much time until my next meeting?"
Yet a rounding clock provides no way at all for you to know whether the meeting has already started or not.
Not sure where I've heard this, but an idea that's been stuck in my head is this: We don't look at clocks to see what time it is, we do so to know what time it isn't yet:
Have I missed the bus yet? Can I already go home? Am I late for this meeting? Do I still have time to cancel this cronjob? All questions that a rounded clock cannot precisely answer.
seanhunter
Those are all questions no clock can answer though. They require state from the physical world over and above knowing the current time.
It’s a wild misconception to think that a flooring clock is somehow more late than a rounding clock, and an even crazier one to think that a clock can tell you whether something in the physical world has or hasn’t happened.
Flooring makes more sense in every case, from years to milliseconds and more. A few reasons:
You want to send a message at exactly 13:00:00, but you have a digital clock that doesn't show seconds. You watch your clock, and as soon as it goes from 12:59 to 13:00, you press the button. That's also how you set the time precisely by hand. With rounding, that moment would be 12:59:30, and who wants to send a message at precisely 12:59:30?
You have a meeting at 13:00:00, you watch the clock to see if you are early or late. With flooring, if you see 13:00, you know you are late. With rounding, you are not sure.
It is common for digital clocks to have big numbers for hours and minutes and small numbers for seconds. If you are not interested in seconds, you can look quickly from afar and you have your usual, floored time. If you want to be precise, you look closer and get your seconds. Rounding the minutes would be wrong, because it wouldn't match the time with seconds. And you don't want clocks with a small display for seconds you may not even see to show a different time than those that don't.
And if you just want to know the approximate time and don't care about being precise to the second, then rounding up or down doesn't really matter.