
Google did not unilaterally decide to kill XSLT

spankalee

People have had some huge misunderstandings of what's actually going on:

- That Mason opening issues means that it's a Google effort. It's not.

- That the "Should we remove..." issue for community feedback. It's not. Spec issues are a collaboration vehicle for spec maintainers. There's not enough of the community on GitHub for that to be a good feedback mechanism.

- That Mason or Google hid comments and locked the thread. I heard on good authority that it was actually Apple employees, in their role as spec repo admins.

- That Google brought up the idea. The best I can see from meeting minutes is that a Mozilla rep did this time, though it's been brought up occasionally for 10 years at least.

- That the spec PR will be merged. At this point the PR is just to show what it would mean to remove XSLT from the spec.

- That a decision has been made. These things are the beginning of the process.

- That XSLT even can be removed. Even though the vendors are tentatively in support, they are fully aware that this might not be viable in practice. I would guess that they think they can remove it, but they don't know for sure. They know usage numbers aren't always accurate, and they have ways of hedging their bets, like flags with different defaults in different channels, enterprise policies, reverse origin trials, etc.

throw7

My jotted down notes:

1. all major vendors (google, mozilla, webkit) want to remove xslt

2. chrome does not have resources to support xslt

3. removing/disabling xslt will be a slow methodical process. don't panic.

4. when opening a change proposal, a pull request of code changes is mandatory to show the exact changes; it is not a "countdown to merge"(sic)

5. info that leaks to the public should include context or links to full context

6. removing xslt support in browsers is not good or bad, but "it depends"

dkiebd

3. It will eventually be removed. Does it matter whether it takes three months or three years? Either way, I suppose none of the browser vendors will give developers money to replace the XSLT usage in their codebases with something else.

5. Funny that we are talking about "info that leaks to the public" when we are discussing standards that may be important to billions of people, as if keeping things private was reasonable.

magicalist

> Funny that we are talking about "info that leaks to the public"

It's a poor choice of words by the GP. This was a public discussion; what would be private about it?

Rather it "leaked" from people with shared context to people without it. The point of the article is that since the discussion is public, there will be people that come across it without context, so it would be a good idea to include context in these kinds of discussions in the future:

> If a removal discussion is going to be held in public, then it should assume the general public will see it and provide enough context for the general public to understand the actual nature of the discussion.

goyagoji

I find it bizarre. Obviously we want to be able to run pages from 2015 far into the future, and certainly for at least a few more years.

As a browser maker, why would you even put this work into coordinated processes instead of investing in a way to patch away your native code, and doing that continuously at a slow pace for every aging feature?

basscomm

> Does it matter whether it will take three months or three years?

It does!

I run a small hobby site built with XML and XSLT because I'm not a great programmer, but XSLT is something I can actually wrap my head around and use without too much fuss. If support goes away I need to know how much time I have to rewrite/migrate my site to something else.

sugarpimpdorsey

> Does it matter whether it will take three months or three years?

Do you think it matters to the guy that said he has an entire factory with IoT machinery that uses XSLT?

Should they shut the factory down?

meepmorp

> Since I suppose none of the browser vendors will give developers money to change their xslt usage in codebases for something else.

Let's turn that around: are you willing to pay a browser vendor to keep supporting xslt so you can keep your codebase unchanged?

danaris

How could #2 possibly be true without it being a deliberate choice on Google's part? They have staggering, absurd amounts of money. If they needed more resources allocated to Chrome, they could just do that.

andybak

This is addressed directly in the linked article. "Google" and "the Chrome team" are not the same entity.

dang

Related ongoing thread:

Should the web platform adopt XSLT 3.0? - https://news.ycombinator.com/item?id=44987552

Recent and also related:

XSLT removal will break multiple government and regulatory sites - https://news.ycombinator.com/item?id=44987346 - Aug 2025 (99 comments)

"Remove mentions of XSLT from the html spec" - https://news.ycombinator.com/item?id=44952185 - Aug 2025 (523 comments)

Should we remove XSLT from the web platform? - https://news.ycombinator.com/item?id=44909599 - Aug 2025 (96 comments)

cosmic_cheese

I hope some form of include makes it into the HTML standard and popular browser implementations before XSLT is gone. It’s perhaps the single largest gap in HTML’s capabilities and could reduce need for SSR, JS, and a build step in a lot of circumstances. Until now XSLT has been able to fill that gap for those who need that capability, but if it’s going away…
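
For context, the way XSLT fills that gap today is roughly a stylesheet like the one below (a minimal sketch; the <page> root element and the header.xml fragment are made-up names):

    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <!-- render a <page> document, pulling a shared header fragment into it -->
      <xsl:template match="/page">
        <html>
          <body>
            <xsl:copy-of select="document('header.xml')/*"/>
            <xsl:apply-templates/>
          </body>
        </html>
      </xsl:template>
    </xsl:stylesheet>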

nashashmi

Remember SHTML, which was supposed to be server-side includes? It never made it to the web browser.

spankalee

cosmic_cheese

Interesting. Not a bad proposal, but am I reading correctly that partials from a different URL (“includes”) are a someday thing rather than part of the initial spec?

Either way it’d allow several types of sites to have zero dependencies and no build step which is pretty cool.

samdoesnothing

There is no point in introducing yet another standard to do something that can be done in a few lines of JS.

   customElements.define(
     "html-include",
     class extends HTMLElement {
       connectedCallback() {
         // fetch the referenced partial and inline it into this element
         fetch(this.getAttribute("href"))
           .then((response) => response.text())
           .then((text) => {
             this.innerHTML = text;
           });
       }
     }
   );
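
Dropped into a page (assuming the class above is saved as html-include.js), usage would look something like:

    <script src="html-include.js"></script>
    <html-include href="/shared/header.html"></html-include>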

The browser has had a powerful scripting language built in as a core primitive for decades, and yet people would rather create more standards to avoid using it for ideological reasons, and then complain that there aren't enough competing browsers.

cosmic_cheese

That’s not nearly as nice as being able to drop a…

   <include src="/shared/header.html">
…in anywhere you want a header. It being part of HTML also allows engines to optimize in ways that otherwise wouldn’t be possible and implement user-toggleable features like lazy loading. It can get better as browsers get better and is less likely to break than any custom JS I write.

nashashmi

The point is to take JS implemented features and make them native to the browser.

samdoesnothing

That's a terrible point.

naniwaduni

It's something that people keep reinventing on both sides of the connection. That XSLT is the better of two terrible ways to do it, and that you're making the argument that the other one still exists, is an embarrassment.

layer8

We should drop HTML and CSS files as well, because you can just use JS to create DOM nodes and set styles on them.

6510

One of the greatest features of pdf is that js support is unreliable.

sugarpimpdorsey

> yet people would rather create more standards to avoid using it for ideological reasons

Uh, the standard in question has existed for over two decades.

bevr1337

A/B testing the removal of a browser standard gives me pause. Is there precedent for that?

simonw

Yes, they've done it before. Some examples:

onunload: https://developer.chrome.com/docs/web-platform/deprecating-u...

Mutation events (replaced by MutationObserver): https://developer.chrome.com/blog/mutation-events-deprecatio...

FTP URLs: https://developer.chrome.com/blog/deps-rems-95#ftp_support_r...

Here's useful documentation on their process, which they sometimes call Chrome Variations and sometimes Origin Trials: https://developer.chrome.com/docs/web-platform/chrome-variat...

th0ma5

They may be talking about actually removing whole standards, not just parts of standards like this. FTP is close, but it was always adjacent to web browsing. Perhaps HTTP 1.0 removal is closest, but that is at a different level of abstraction than XSLT, which is potentially and subtly everywhere.

bawolff

SameSite by default cookies come to mind: https://www.chromium.org/updates/same-site/ (and to be clear, this is a removal of functionality. The "feature" is to make cookies not work in certain contexts)
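
Concretely, cookies without an explicit SameSite attribute started being treated as SameSite=Lax, so anything that relied on cross-site cookies had to opt back in along these lines (an illustrative header, not any particular site's):

    Set-Cookie: session=abc123; SameSite=None; Secure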

Outside of web browsers, i think X11 is somewhat famous for this sort of thing.

EmuAGR

They removed JXL support just as it was gaining popularity, so it wouldn't interfere with their AVIF plans.

pdntspa

Wouldn't it be better if all interested parties banded together and put forth an XML+XSLT-to-HTML translation with a common interface, and then integrated that?

6510

I have lots of ideas for things that can just be removed.

We could remove framesets and remove var from javascript. Remove the Date object now that there is a Temporal api. Remove tables and flexbox now that there is grid. xhr can go too now that we have fetch. The <center> tag isn't needed anymore. I'm sure we can find support from people disliking onclick and onsomething attributes. Or how about hoisting? Surely we can simply rm that? Removing things doesn't have to be limited to older things. asm.js and web workers weren't really necessary at all.

I'm a fountain of good ideas.

ameliaquining

The idea is not to just remove things for the sake of removing things, but to engage in a cost-benefit analysis. Most of the things listed above either are very widely used, or don't cost much to maintain or have other downsides. By contrast, when things have been removed from the Web platform in the past, it was because they were causing problems out of proportion to the amount of breakage induced by removing them. In the case of XSLT, the problem is the attack surface that it adds.

nashashmi

The people who use these features are busy using these features. And they are not part of browser development. So they revolt in a nasty manner. Like when ftp was torn down.

It is nice to see workarounds. But those workarounds don't suit HTML purists who do things without JS. They are the real web developers. They have always relied on the browser to improve and become faster, not to start abandoning old technologies.

Chrome OS also became popular on this very point: a browser can act as a universal viewer, so the need for separate programs goes away. There are so many lite OSes that also use the browser to do everything.

Now I understand that the web has failed XML and XML failed the web in favor of JSON. I also wholeheartedly believe that XML and XSLT can do so much more for the web, and do it natively.

But open systems are not in the interest of the big FAANG and Microsoft ecosystem. They abandoned RSS. They abandon APIs on a regular basis. And this turn of events is causing browser vendors to start developing for big companies rather than open indie developers.

There is much to gain from XML and XSLT. But I want to see a specific development. I want to see XSL import an XML. I want to see the reverse. XSL will be the view. XML will be the model. And the browser will be the controller. The MVC paradigm.
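
Half of that wiring already exists, for what it's worth: the model (XML) can point at its view (XSL) with an xml-stylesheet processing instruction, and a stylesheet can pull in XML the other way with document(). A minimal sketch with made-up file names:

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="view.xsl"?>
    <article>
      <title>Hello, MVC</title>
      <body>The browser applies view.xsl to render this model.</body>
    </article>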

oorza

> HTML purists who do things without JS. They are the real web developers.

I don't think using one set of technologies as compared to another can really be said to make one a "real" web developer. Real web developers are developers who put sites on the web; there is no benefit to anyone in claiming one choice is "real" and therefore the other choices are lesser-than.

Put it this way: whatever set of constraints you used to arrive at that decision does not apply to every situation, and when you frame things through the lens where you implicitly disregard that oh-so-obvious truth, it's hard for anyone to interpret your analysis as anything but myopic in the best case and actively self-serving and destructive in the worst case. It's nearly impossible to read through someone speaking this way about a topic and believe their analysis is objective, comprehensive, or without obvious bias, even if it may actually be all of those things.

karlgkk

> Like when ftp was torn down

FTP needed to be torn down. It’s sad, but true. There was no reason for anyone to be using it past 2010 - and in fact many reasons actively against using it

nashashmi

It was a quick way to share files. FTP as an internet application made a lot of sense. The only counterargument was that it did not protect passwords the way SFTP does and did not support encryption. But that is like arguing against HTTP in favor of HTTPS.

I don't think they put SFTP in browsers yet.

ocdtrekkie

I have yet to find a website which pleasantly lets me download a bunch of things in an efficient and organized manner that compares with FTP. The hilarity of buying a Humble Bundle and pressing "download all" and watching your browser spam "save as" windows for a minute and a half...

I kinda wish I could (S)FTP a lot more things than I can today.

bawolff

Wget? Like this isn't a protocol issue but a website issue. You can do this sort of thing with http just fine.

spankalee

FTP was not torn down just because some web browsers stopped also being FTP clients. You can still use any FTP client with any FTP server that you want.

nashashmi

But it was torn down from the browser. That's what we are talking about with respect to XSLT.

worble

I do not understand how this can possibly be considered. It comes down to one question: will this break websites that are in production right now?

If the answer is yes then it can't be done, simple as. I really don't care about anything else, you can't just break the web for people who are actively using it.

spankalee

Removing Flash, Mutation Events, and third-party cookies broke websites.

danaris

I mean, of course they can. They're Google. What are you going to do, stop using Chrome and searching with Google and using Gmail and buying AdWords and....etc?

This is what happens when you have a monopoly.

scotty79

I hope that if they decide to update XSLT they'll also implement this:

https://en.wikipedia.org/wiki/Efficient_XML_Interchange

scrollaway

Fun fact - A very old version of the "WoW Armory", which was Blizzard Entertainment's World of Warcraft database at https://wowarmory.com/, used XSLT for all of its styling - though, IIRC, only on Firefox, which was the only web browser to properly implement it.

The website's "item" and "character" pages were served entirely in XML.

Here's a web archive: https://web.archive.org/web/20080220170805/https://wowarmory...

(dead_dove.gif)

paulsutter

Is there any reason this can't be solved with a proxy server? So that legacy software that uses XSLT can still run on the new browsers that lack it?

XSLT is a really weird feature and it seems sensible to drop it; it would be nice if there were a transparent solution for old software that uses it.

basscomm

> Is there any reason this can't be solved with a proxy server? So that legacy software that uses XSLT can still run on the new browsers that lack it?

With XSLT in a browser I can throw some static XML and XSLT files on any web server and they will Just Work™ and be usable without me having to do much other than telling the web server to serve index.xml instead of index.html. If I have to learn how to set up and maintain a proxy server so visitors to my site can still view the rendered pages, then I probably won't bother.

> XSLT is a really weird feature and it seems sensible to drop it

What's weird about it? It lets me mark up some text using whatever XML makes sense to me and then write a template that the browser uses to transform it into (X)HTML that it can display. For someone who builds basic sites, it's an easy way to do templating that doesn't involve me setting up a 'real' programming environment.
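
To make that concrete, the whole setup is an index.xml in whatever markup I like plus one stylesheet along these lines (a minimal sketch with made-up element names):

    <!-- post.xsl: turns a <post> document into plain HTML -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/post">
        <html>
          <body>
            <h1><xsl:value-of select="title"/></h1>
            <p><xsl:value-of select="body"/></p>
          </body>
        </html>
      </xsl:template>
    </xsl:stylesheet>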

pjmlp

Remember similar arguments were made not to support WebGL 2.0 Compute, also from the Chrome team, because WebGPU was going to be much better.