Why are 2025/05/28 and 2025-05-28 different days in JavaScript?
156 comments
·May 28, 2025
Y_Y
tluse
Pro tip: never ever use anything but ISO dates in UTC tz unless you're displaying it for a user in a UI.
names_are_hard
What if you're storing a calendar date, such as a birthday? A timestamp is inappropriate, and it's meaningless to discuss timezones in this context.
(Example - you want to know if a person is old enough to buy cigarettes, and you need to store a birthday that you can compare against the current day to see if they're legally 18 - if you store an epoch at UTC, do you store the time of day they were born? That's not the way the law works. Do you store midnight UTC? If they're currently in NY, can they buy cigarettes at 7pm the day before their birthday because they're currently 18 in London?)
Sometimes you need a logical calendar date, not a point in time in the history of the universe.
guffins
That’s a great question. ISO 8601 doesn’t allow timezone offsets on date-only strings.
If you were born in the US, can you buy cigarettes at 12:00 am on your 18th birthday in London?
I’ve never heard of age verification laws caring what timezone you were born in. In fact, you couldn’t even pinpoint this from many people’s ID cards. Plenty of US states span multiple time zones, and I wouldn’t be that surprised if there were a maternity ward out there sitting on a TZ line.
rubslopes
All solutions have problems, but I think UTC midnight is simpler than dealing with mixed date formats in the backend.
mytailorisrich
If you want to store a date you don't need to store a time, time zone, etc. and your question goes away.
Certainly if you want to store birth dates and do age verification there is no point bothering with these issues, just store calendar date. Trivial to get date for age limit purposes.
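A calendar-date comparison like the one described above needs no clock at all. A minimal sketch, assuming a plain `{y, m, d}` record shape (the shape and the `isAtLeast` helper are made up for illustration, not from any library):

```javascript
// Age check on plain calendar components -- no epochs, no timezones.
// `birth` and `today` are {y, m, d} records (hypothetical shape).
function isAtLeast(birth, today, years) {
  const cutoffYear = birth.y + years;
  if (today.y !== cutoffYear) return today.y > cutoffYear;
  if (today.m !== birth.m) return today.m > birth.m;
  return today.d >= birth.d;
}

// Turns 18 exactly on their birthday, in whatever jurisdiction "today" came from.
console.log(isAtLeast({ y: 2007, m: 5, d: 28 }, { y: 2025, m: 5, d: 28 }, 18)); // true
console.log(isAtLeast({ y: 2007, m: 5, d: 29 }, { y: 2025, m: 5, d: 28 }, 18)); // false
```

The only judgment call left is which jurisdiction supplies "today" - which is a legal question, not a timezone-conversion one.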
fnord123
RFC3339.
ISO 8601 allows for 2 or 6 digit years. Truncating to 2 is just incorrect, and 6 digits is absurd. And you can read the RFC without paying ISO - and you can discuss the RFC with people who have also read it instead of relying on people using the Wikipedia page to interpret and explain ISO 8601.
I have a scheduling service at work and I keep getting requests for implementing ISO 8601 timestamps but I ignore them. RFC3339 is the way forward.
jcul
There's this handy venn diagram that I've seen floating around for a long time.
Just found a random link to it with an image search:
https://gyazo.com/d8517f72e24c38f055e17182842b991c/max_size/...
ISO 8601 does have some strange formats...
Y_Y
I believe two-digit years haven't been allowed for a while:
> ISO 8601:2000 allowed truncation (by agreement), where leading components of a date or time are omitted. Notably, this allowed two-digit years to be used as well as the ambiguous formats YY-MM-DD and YYMMDD. This provision was removed in ISO 8601:2004.
(That's from https://en.wikipedia.org/wiki/ISO_8601 - I don't have the standards handy, ironically.)
Honestly I'm happy with either the RFC or ISO, but it seems like most normies haven't heard of RFCs so ISO is my default.
account42
Plus RFC3339 allows you to use a space instead of the ugly T delimiter between the date and time.
realaleris149
> 6 digit years
Totally insufficient for capturing important future events like the death of the sun.
nssnsjsjsjs
And unless you need the original date, time, and time zone.
Conversion to UTC is not injective, e.g. when clocks change or politics happen.
davejohnclark
There's a good post about why this isn't as foolproof an approach as it might first seem here https://codeblog.jonskeet.uk/2019/03/27/storing-utc-is-not-a...
christina97
The post doesn’t account for the case where a suburb of Amsterdam breaks off from the rest of Netherlands and changes timezones… Then how do you know if the event was in the neighborhood or not?
My point is that this is an extremely niche case and works around one particular type of timezone insanity. You either have a team dedicated to dealing with timezone insanity, or you store stuff in UTC.
Scarblac
I disagree, store UTC time and the name of timezone it was originally recorded in so it can be translated back to that as well.
UTC only loses information.
SAI_Peregrinus
You conceivably need more than that. You need to know the location (NOT just the time zone) where it will occur, the time zone that location was in when the meeting was created, and the time zone that location is in when the meeting actually happens. The users' systems or whatever does final display need to know their own time zone(s), and how to convert.
Say, for a somewhat annoying-case example, you want to store a meeting date/time that's in the future, in January, in the city of New York, New York, USA, with remote participants elsewhere in New York state. Sometime in between when the calendar invite is created and the meeting, the city of New York decides to change to be on permanent Daylight Saving Time, but the rest of New York state doesn't change. If you stored only the UTC time and "America/New_York" you now have an ambiguous meeting date/time, since the "America/New_York" time zone split and the city of NY is an hour off from the rest of NY for part of the year, and your remote participants could get the wrong time.
There's probably an even worse case involving the death of a Japanese emperor, since the "period" portion of a Japanese date is the imperial name of the emperor who ruled at that time, and that gets retroactively applied to dates between the new emperor's coronation and when they took their new imperial name.
skissane
> and the name of timezone
The problem is that can be difficult to portably determine. One wishes POSIX had an API “give me IANA time zone name for current process” which would do the needful to work it out (read TZ environment variable, readlink /etc/localtime, whatever else might be necessary)… but no, you are left to do those steps yourself. And it works reasonably well if the TZ environment variable is set, but it most commonly isn’t; readlink of /etc/localtime works on macOS and some Linux distros… but others make /etc/localtime a regular file not a symlink, which makes it all a lot harder
And that’s POSIX. Then there’s Windows which is possibly the last platform to still use its own timezone database instead of IANA’s. Now, Unicode CLDR maintains a Windows-to-IANA mapping table… but you have to ship both that table, and maybe the IANA timezone DB too, with your app, and keep them updated
I really wish Microsoft would ship the IANA database with Windows, and the IANA-Windows mapping table too, and provide APIs to query them, and keep them updated with Windows update. The core Windows OS and existing Windows apps can keep on using the legacy Windows TZ database for backward compatibility, whereas portable apps could use IANA instead if they wish
mariusor
Why not store it as a time date in the original timezone then?
jgalt212
Yes, but whose timezone? The machine? The user? What about a machine to machine transaction?
sph
Pro tip #2: never ever rely on automatic parsing of dates. It's a lie and will corrupt your data.
Either use dedicated "from_iso8601" functions, or manually specify the format of the input string ("%Y%m%dT%H%M%SZ")
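In JavaScript terms (where the standard library has no strptime), "manually specify the format" amounts to a strict regex plus Date.UTC. A sketch - the function name `parseBasicUtc` is made up for illustration:

```javascript
// Accept exactly one format ("basic" ISO 8601, e.g. 20250528T121500Z)
// and refuse everything else, instead of trusting Date's string sniffing.
function parseBasicUtc(s) {
  const m = /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/.exec(s);
  if (!m) throw new Error(`unexpected timestamp format: ${s}`);
  // Date.UTC pins the interpretation to UTC; months are 0-based in JS.
  return new Date(Date.UTC(+m[1], +m[2] - 1, +m[3], +m[4], +m[5], +m[6]));
}

console.log(parseBasicUtc('20250528T121500Z').toISOString()); // 2025-05-28T12:15:00.000Z
```

Anything that doesn't match throws immediately instead of silently becoming the wrong instant.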
j1elo
The list of things that devs should "never ever" do grows by the day... turns out most things, even the simplest of things, even sometimes the most apparently trivial things (like naming of people, too) are in reality almost always much more complex and difficult to get right on the first go than expected.
But then discussion ensues about how programmers these days add libraries as dependencies for almost everything! :-)
I guess at some point a middle ground must be found.
dotancohen
> never ever use anything but ISO dates in UTC tz unless you're displaying it for a user in a UI.
Not good for storing future meeting times. DST switchover dates can change, and your tz-normalized date won't change with it.
saltcured
Or if you really want to be future proof, store the geolocation so you can try to figure out the jurisdiction for any changing regulations. Maybe they didn't change the date of the switch, but changed the timezone boundary on the map.
But, then I guess we might need to account for fractured societies and actually store some kind of organizational code for which belief system the event author adheres to? :-)
mytailorisrich
To me that's a UI/UX issue.
Internally everything is stored and handled in TAI (better than UTC, as it has no discontinuities) and translated from/to something else for human consumption.
E.g. you should have logic to figure out what TAI period corresponds to "next month's meetings" if that's what the user wants, apply it immediately on user input, and then forget about DST, time zones, etc. in the rest of the code and storage.
Another benefit is that if your user was in New York but is now in London it is trivial and well-constrained to adjust to local time.
hughw
Some use cases really do require the local TZ offset be saved. Transforming everything to UTC wipes out that information.
An engineer in the US reviewing industrial measurements logged in a plant in Asia from a variety of sources is definitely going to encounter lots of events recorded in local time. It would be maddening for that engineer to have to review and resolve events from different time coordinates, especially if they are doing the review months or years later. It's best to accept that reality and adopt local time as the standard. Then you must record the TZ offset from UTC in any new system you create.
bbojan
You mean you must record the timezone? Because the TZ offset can change throughout the year (e.g. due to daylight saving time).
thrdbndndn
How are slashes ambiguous if you use YYYY and YMD order?
shreddit
Because Americans use slashes commonly in Y/D/M
skissane
YDM is very rare; according to Wikipedia, only found in Kazakhstan, Latvia, Nepal, and Turkmenistan, and even in those countries it commonly coexists with other formats.
I believe internationally that YMD and DMY are the two most common formats (and also the most logical), and the (much less logical) MDY comes third… YDM is a very distant fourth. MYD, DYM are theoretically possible but doubt anybody uses them
Actually, although MDY is by far most common in the US, I’ve seen US federal government forms that use DMY instead (although I imagine that’s rare, I’m sure you’ll find DMY US govt forms are greatly outnumbered by MDY ones)
shreddit
My apologies to all Americans, of course i meant M/D/Y, i just got it backwards
thrdbndndn
I'm not American but I lived in the US for 10 years.
I had never seen Y/D/M. I almost never see Americans use Day/Month order to begin with, let alone Y/D/M.
binarymax
No we don’t. I’m American and I’ve never seen this format.
hnbad
Wait, I thought Americans use M/D/Y not Y/D/M?
kace91
Uh, TIL. As a non American it was known to me that Americans use the YDM format but I had no idea slashes vs dashes carried meaning for that.
jfindley
The standard is... well, it IS indeed a standard, I guess you can't really argue that, but it's a great deal more permissive than many people might hope or expect. https://ijmacd.github.io/rfc3339-iso8601/ is a wonderful illustration of some of the deeply silly time formats permitted by ISO 8601.
PokerFacowaty
I recently spent 2 hours finding a bug, precisely because JS can't comprehend dates/times without a Unix timestamp underneath. I'd take a date from Postgres, e.g. "2025-05-24", and the first time JS encountered it, somewhere deep in the package I was using, it needed to add a time (midnight, that's sane) and a timezone (local time :) ). I was trying to use UTC everywhere, and since the read dates had midnight of UTC+2 as the time, they were all a day behind in UTC.
Special shoutout to the author of node-postgres for saying PG's date type is better not used for dates in this case.[1] I love programming.
[1] https://node-postgres.com/features/types#date--timestamp--ti...
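The skew PokerFacowaty hit can be shown directly. Per the ECMAScript spec, a date-only ISO string parses as midnight UTC, while the legacy slash form is outside the spec's format and the major engines (V8, SpiderMonkey) parse it as midnight local time - a sketch you can run in Node or a browser console:

```javascript
const iso   = new Date('2025-05-28'); // spec: midnight UTC
const slash = new Date('2025/05/28'); // legacy form: midnight *local* time in major engines

// The two instants differ by exactly the local UTC offset, so in any
// zone west of Greenwich the ISO date "moves back" a calendar day.
console.log((slash - iso) / 60000 === slash.getTimezoneOffset()); // true
console.log(iso.toISOString()); // 2025-05-28T00:00:00.000Z, regardless of locale
```

Which is exactly why a "date" column read as a string and fed to `new Date()` ends up a day behind in UTC+2.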
normie3000
> it needed to add a time (midnight, that's sane)
Is it sane? Is midnight at the start of a day, or the end of it? I'd think noon would be less ambiguous, and significantly less prone to these timezone issues (although this may not be a benefit).
kccqzy
Indeed it is not sane. Languages should provide a separate Day type to operate on dates without times. Forcing everything to use dates with times and timezones causes bugs in applications that don't need times.
dumah
ISO 8601-1:2019/Amd 1:2022
Midnight at the start of the day: 00:00:00
Midnight at the end of the day: 24:00:00
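ECMAScript's Date Time String Format adopted the same convention: the hour field may be 24 (with zero minutes and seconds), and it normalizes to midnight at the start of the next day. A quick check (verified in V8; other engines should agree since this is spec-mandated):

```javascript
const startOfDay = new Date('2025-05-28T00:00:00Z');
const endOfDay   = new Date('2025-05-28T24:00:00Z'); // "24:00:00" is legal per the spec

console.log(endOfDay.toISOString());             // 2025-05-29T00:00:00.000Z
console.log(endOfDay - startOfDay === 86400000); // true: exactly one day apart
```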
bandrami
How did they manage to build the entire modern Web on a language without a standard library?
jasode
> on a language without a standard library?
Because the other languages with bigger runtimes and more comprehensive standard libraries - Java applets, Microsoft Silverlight, Macromedia Flash - that were promoted for browsers to create "rich fat clients" were ultimately rejected for various reasons. The plugins had performance problems, security problems, browser crashes, etc.
Java applets were positioned by Sun & Netscape to be the "serious professional" language. Javascript was intended to be the "toy" language.
In 1999, Microsoft added XMLHttpRequest() to IE's Javascript engine to enable Outlook-for-web email that acted dynamically like Outlook-on-desktop without page refreshes. Other browsers copied that. (We described the early "web apps" with jargon such as "DHTML" DynamicHTML and "AJAX".) In 2004, Google further proved out Javascript's capabilities for "rich interactive clients" with Gmail and Google Maps: smoothly dragging map tiles around and zooming in and out without Macromedia Flash. Even without any deliberate coordinated agenda, the industry collectively began to turn Javascript from a toy language into the dominant language for all serious web apps. Javascript now had huge momentum. A language runtime built into the browser without a standard library, like Javascript, was prioritized by the industry over options like plugins that had a bigger "batteries included" library. This overwhelming industry preference for Javascript happened before Node.js brought it server-side in 2009 and before Steve Jobs supposedly killed Flash in 2010.
The situation today of Node.js devs using npm to download "leftpad()" and a hundred other dependencies to "fill in the gaps" of basic functionality comes from the history of Javascript's adoption.
normie3000
> other languages & with bigger runtimes and more comprehensive standard libraries such as Java
Java's Date standard lib was awful for 2 decades, so there's no guarantee that a big standard library is a good standard library.
sethammons
Even in a good standard library, oddities arise. Many praise Go for its standard library but then there is its time format that raises an eyebrow: 01/02 03:04:05PM '06 -0700
throwaway2037
I agree 100%. Thank goodness that Stephen Colebourne's Joda-Time and later JSR-310 fixed all of that. The new date/time libraries are a dream in Java.
n2h4
[dead]
yen223
The world where JavaScript has a robust standard library is one where there is only one browser vendor who gets to call all the shots regarding what the web looks like.
That is not a better world.
cjpearson
Possibly, although the fix for this particular issue (Temporal API) is available in the current release of Firefox as well as the preview release of Safari, but it's not in that dominant browser.
Xelbair
and that differs from chrome dictating standards how?
Also, this could've been handled easily by a committee (ugh), or by a third-party open source organization akin to the Linux Foundation that just makes a JS library all browser vendors use. Or by making just a specification for how JS handles these things.
you know - like a lot of other languages are handled, including their standard library.
h4ck_th3_pl4n3t
Kind of funny that you think the open web still exists.
I mean, even Microsoft gave up and just went with Chromium, and they got the definition of almost infinite resources at their disposal.
Effectively if your website doesn't run in Chrome and Safari, it won't be seen by 99% of the market.
0points
> even Microsoft gave up
Ah yes, Microsoft, the defenders of the free world.
Y_Y
> That is not a better world.
Not a better world, just the current world.
aloha2436
Safari is too valuable of a platform for web developers to ignore, but otherwise yes it's the only real exception to the Chrome monopoly and still certainly much smaller in terms of absolute users.
johncoltrane
I want that world.
baq
Actually it is. Sorry.
Y_Y
There's an even worse language for this. It's absolutely essential for any modern e-commerce, but has no official standard or spec; it has many thousands of libraries, but they're mostly poorly maintained and geoblocked. The language itself is just a hodgepodge of other languages that were popular in the past, and as a result it's worse than Perl for having too many ways to do any single thing. The major dialects may be theoretically compatible, but in practice there's usually friction when you try to mix them.
[Ok that's enough, ed.]
VoidWhisperer
PHP?
Y_Y
Rather than spoil this tremendous piece of humour, I'll just give a rot13-encrypted version of the answer: Ratyvfu
bandrami
Since I'm here, one of my standard interview questions when I'm hiring is "how would you convert a Gregorian date to a Julian date?" (and I make clear there's no penalty for not knowing those terms, and explain them if needed.)
What I'm looking for is "there has to be a library function for that; I would look it up".
omneity
With a lot of motivation, grit, coffee and reinventing the wheel every couple of weeks.
crubier
An interesting related question could be: Why did all languages with standard libraries fail to become the lingua franca of the Web against JavaScript?
HPsquared
Path dependence. JS was in the browser.
hnbad
Because the alternative would have been VBScript (if Microsoft had won at the time).
JS was a compromise. It had to be sent out the door quick, it needed to look sufficiently like Java to not upset Sun who were trying to establish Java as the universal platform at the time while not being feature complete enough to be perceived as a competitor rather than a supplement. And it had to be standardized ASAP to pre-empt Microsoft's Embrace Extend Extinguish strategy (which was well on its way with JScript). That's also why it's an ECMA standard rather than ISO despite Netscape not having been based in Switzerland - ECMA simply offered the shortest timeline to publishing a standard.
I think what's more amazing isn't just how we managed to build the bulk of user interfaces in JavaScript but how Node.js managed to succeed with ECMAScript 3. Node.js was born into a world without strict mode and without even built-in support for JSON: https://codelucky.com/javascript-es5/ - and yeah, ECMAScript 3 was succeeded by ECMAScript 5, not 4, because it took vendors 10 years to agree on how the language should evolve in the 21st century - not only did we build the modern web on JavaScript, we built a lot of the modern web on the version of JavaScript as it was in 1999! Even AJAX wasn't standardized until 2006, when Web 2.0 was already in full swing.
AlienRobot
It's called a web because of how it's held together.
dotancohen
It's called a web because it's full of bugs.
mrweasel
Dates in JavaScript are just a special kind of broken. Even with the more modern APIs for formatting dates it's wonky at best.
thrdbndndn
After carefully reading the timeline, I think the most surprising part is this: when Chrome switched again to defaulting to local time for date-only forms in 2015 (together with the date-time form), someone complained it was a "breaking change" despite the fact that Chrome was simply following the spec. The complaint went so far that it eventually caused the spec itself to change, and now we're stuck with the Frankenstein mess we have today.
By that, I don't mean to dismiss the importance of backward compatibility, but this case is particularly funny because:
1. It had already been changed multiple times, each a breaking change, so it’s not like this form of compatibility was ever seriously respected;
2. Having it behave differently from other "legacy forms," like the slash-separated version, is itself arguably a break in backward compatibility;
3. As noted in the article, it never worked the same between Chrome and Firefox (at this point) anyway, so it’s doubtful how impactful this "breaking change" really was, considering you already had to write shim code either way.
nordiknomad5
I am not sure why my Brave browser does not print the discrepancy shown in the OP:
console.log(new Date('2025/05/28').toDateString());
console.log(new Date('2025-05-28').toDateString());
console.log(new Date('2025-5-28').toDateString());
Output below:
Wed May 28 2025 debugger eval code:1:9
Wed May 28 2025 debugger eval code:2:9
Wed May 28 2025 debugger eval code:4:9
dust42
If your local timezone is GMT>=0, then you won't see it. It's a special form of Heisenbug that you only see easily when located in the Americas; in Europe/Africa/Asia it is invisible unless you switch your local time zone to a GMT-X value.
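A way to see the discrepancy regardless of where you sit is to compare the underlying instants instead of toDateString(), which re-applies the local zone and hides the skew. A sketch (assumes a V8-style engine that accepts the non-ISO spellings):

```javascript
// toDateString() formats in local time, masking the difference; the
// epoch values underneath are what actually diverge between the forms.
for (const s of ['2025/05/28', '2025-05-28', '2025-5-28']) {
  console.log(s.padEnd(10), '->', new Date(s).toISOString());
}
// Only '2025-05-28' is pinned to ...T00:00:00.000Z; the other two float
// with the machine's timezone, which is why GMT+X machines show no error.
```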
nordiknomad5
ok thanks for the clarification :)
fimdomeio
On a broader note for the ones out there not familiar with this "timeless" classic:
Falsehoods programmers believe about time gist.github.com/timvisee/fcda9bbdff88d45cc9061606b4b923ca
vladde
In Sweden, we can write dates as 28/5-25
0points
As another swede, I dare you!
We swedes use standardized ISO 8601 dates such as YYYY-MM-DD as dictated by our excellent government and you find it in use in our social security number, government correspondence and mostly everywhere.
rf15
> YYYY-MM-DD as dictated by our excellent government
Same here in germany! ...Which is the reason why everyone ignores it in favour of the traditional format.
I love democracy, and also mountain-shaped temporal unit ordering: it's 28.05.2025 13:15.
Text-ordering by date is a nightmare because everything is first grouped by day-of-month, then month, then year! :)
dylan604
So dates have endianness?
Philpax
Even more fun when that includes date or time ranges; I've literally had to paste Swedish-style dates into Claude to help me parse what the intended reading is.
hnbad
I guess some people just want to see the world burn.
Y_Y
-20.6
raverbashing
Being able to does not mean that you should
Scarblac
Not 28/5 '25?
lifthrasiir
Ah, the usual issues with any sufficiently old language that has multiple popular implementations. Enjoy your standards. (At least ECMAScript is a relatively well-thought-out standard...)
pif
I'm missing the usual suspects complaining about how browsers exploit undefined behaviour and break programmers' expectations.
alex-knyaz
What are best practices/tips on handling date and time everyone has in general? Every time it is a bit of a nightmare.
c17r
Best I've seen is from the venerable Jon Skeet: https://codeblog.jonskeet.uk/2019/03/27/storing-utc-is-not-a...
happytoexplain
The best advice is unfortunately to not use a generalized practice. E.g. never "just use UTC". Use UTC when it makes sense.
- Understand the semantic difference between a timestamp (absolute time) and clock/calendar time (relative time). Understand which one your use case uses. Don't use one to store the other.
- If the use case calls for a relative time, do not manually construct or edit the date. Use your platforms date-creation/modification APIs, no matter how unnecessary they seem.
- Understand what is inside your platform's date types at rest. Understand which of your platform's date APIs pull in environmental information (time/tz/locale), as opposed to only using the arguments you pass it. Understand that your platform's 'print/stringify' function may be one of those aforementioned functions. Misunderstanding this often leads people to say inaccurate things. E.g. say your platform has a Date object that stores an epoch-based timestamp. People may say "the Date object is always in UTC", when really the Date object has no time offset, which is not the same thing.
- Understand that if you pass a date around platforms, it might accidentally be reserialized into the same absolute time, but a different relative time.
- Understand that there is a hierarchy of use cases, where each one has more complex requirements:
1. "Create/modify" timestamps; egg timers. (absolute time)
2. Alarm clocks (same clock time always).
3. One-time calendar events (has an explicit, static tz; same clock time if the user changes its day or time zone; different clock time if the user's time offset changes)
4. Recurring calendar events (same as above, except don't change the clock time if the user's time offset changed due to DST, as opposed to a geographic change)
5. Recurring calendar event with multiple participants (same as above, just remember that the attached tz is based on the creator, so the clock time will shift during DST for participants in a place without matching DST rules).
Note that a lot of platforms nowadays have built-in or 3rd party packages that automatically handle a lot of the rules in the above use cases.
Finally, understand that all those little weird things about dates (weird time zones, weird formatting conventions, legislative time zone changes, retroactive legislative time zone changes, leap days, leap seconds, times that don't exist), are good to know, but they will mostly be accounted for by the above understandings. You can get into them when you want to handle the real edge cases.
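The point above about what sits inside a platform's date type at rest can be seen concretely in JavaScript itself: a Date holds one epoch number and nothing else, and every "local" accessor re-derives fields from the environment at call time. A quick check:

```javascript
const d = new Date('2025-05-28T12:00:00Z');

// The entire state is one number: milliseconds since the Unix epoch.
console.log(d.getTime());     // 1748433600000
console.log(d.getUTCHours()); // 12, on every machine

// No offset is stored on d; getHours()/toString() consult the host's
// current timezone each call, so their output is environmental.
console.log(typeof d.getTimezoneOffset()); // 'number', but computed, not stored
```

So "the Date object is always in UTC" is the inaccurate phrasing - the accurate one is that it has no offset at all.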
divan
WAT [1]
hnbad
It's a good talk but it's an even better talk if you actually like JS and understand not only why those things are the way they are but also that you wouldn't ever do them in normal code.
poincaredisk
You wouldn't do them intentionally, but they're there waiting to bite you in the foot.
divan
Heartbreaking to hear that there are people who "actually like JS".
Hang on, slashes and year-month-day?
https://en.wikipedia.org/wiki/ISO_8601
Handed down by the ISO, The Great Compromise allows YYYY-MM-DD (or YYYYMMDD if you're in a hurry) but the version with slashes I'd find ambiguous and upsetting, especially early in the month.
The standard is good, and you can get it from `date -I`. Hell mend anyone who messes with the delimiters or writes the year in octal or any other heresy.