
What Killed Innovation?


78 comments · March 25, 2025

janalsncm

Innovation in data visualization? From a purely utilitarian view, the purpose of data visualization is to present data in a way people can understand it. If you’re constantly changing the method of visualizing the same thing it’s harder to do that. Sometimes a bar chart is best.

As far as cool visualizations go (that are better served as nonstandard visualizations) there are two recent ones that come to mind:

https://youtu.be/TkwXa7Cvfr8 (Especially around 16:56)

https://bbycroft.net/llm

marginalia_nu

I'd also argue that even if all else is equal, a flashy visualization is worse than a conventional one, as you generally do not want to draw attention to the presentation if your aim is to convey information.

NooneAtAll3

I thought the youtube link would be this https://www.youtube.com/watch?v=SwIyd_gsGWA

borgdefenser

I love data visualization, but it very much reminds me of shred guitar playing, something I also used to very much love.

What non-guitar players are complaining about the lack of innovation in shred guitar playing? It is just not something that non-guitar players really care much about. Good shred vs bad shred is all going to sound the same to the non-guitarist anyway.

0xbadcafebee

Innovation is never constantly increasing. It usually appears in bursts, and stops around the point that humans don't need it as much, or development hits a ceiling of effort. But it's always slowly simmering. Usually it's research or yak-shaving that, after years, suddenly appears as if out of nowhere as a useful product.

I am hopeful that in my lifetime, the web will die. It's such an insanely stupid application platform. An OS on an OS, in a document reader (which, due to humans' ability to go to any lengths to avoid hard work, literally all new network protocols have to be built on top of).

You want cool visualizations? Maybe don't lock yourself into using a goddamn networked document viewer. Native apps can do literally anything. But here we are, the most advanced lifeforms on the planet, trapped in a cage of our own making.

feoren

> An OS on an OS, in a document reader

Versus an interpreted language executed in a runtime running on a virtual thread of an OS running on top of a BIOS over a glorified calculator!? Insanity! Whatever happened to good old-fashioned pen and paper!?

There's nothing wrong with the model of delivering your software as a small program to run in a sandboxed browser environment. WASM, canvas, WebGL -- you can do nearly as much on the web as native nowadays, with a dead-simple deployment model. One of the only types of programs that's much harder to make as a web application is malware. Calling a modern browser a "networked document reader" is as silly as calling a modern computer a calculator.

collingreen

The DOM seems fair to call a networked document reader. You've suggested a different build target for what would have been native apps - I think you and OP meet in the middle a bit; you get the power of non-html app development. OP laments the overhead of having to shove that into the existing web model designed for documents; you appreciate the sandboxing.

I think you have similar opinions that mostly overlap, regardless of insults about statements being silly.

billyp-rva

> Native apps can do literally anything.

That's just as much a downside as an upside. You're putting a lot of trust in a native app that you aren't putting in a website.

treyd

We have sandboxing technology on every modern operating system.

nullpoint420

What about sandboxed native apps? If the browser can do it, why can't native apps do it as well?

ryandrake

> I am hopeful that in my lifetime, the web will die.

I'd like to see the www go back to its roots as a way to share and browse documents, hyperlinked together. The web worked when it was just documents to render and click on links. It is terrible as an application platform.

It's been 30 years since JavaScript was invented. Imagine what we'd have today, if instead of making the WWW into this half-assed application platform, those 30 years of collective brainpower were instead spent on making a great cross-platform native application development and delivery system!

sunrunner

The web as it was originally conceived - readable (but not interactive) content with linked resources - feels a far cry from the web of today, a platform for interactive applications that seems to grow asymptotically towards feature-parity with native applications (UI, input handling, data processing, hardware access) while never quite getting there, encompassing the fundamental things that make 'applications' work.

If the modern web _did_ reach feature parity in some way the real question would then be 'What makes it different?'. As linked resources doesn't seem like a particularly strong unique feature today the only other things I can think of are the simpler cross-platform experience and the ease of distribution.

So then the questions are 'What would make for a better cross-platform development experience?' (Chromium embedded framework not included) and 'How do we make app distribution seamless?' Is it feasible or sensible to have users expect to access every application just by visiting a named page and getting the latest version blasted at their browser?

And I guess that's how we got Chrome OS.

bobthepanda

The web gained traction as a development platform because for the most part, it broadly works the same on every device due to the web standards, and so it's very easy to develop something that works consistently on all the different devices. Purists may bemoan that things no longer respect the "native look and feel" but that is a feature, not a bug, for the vast majority of users and developers. As an example, I absolutely hate that my work email on Outlook does not have the same feature set on Windows vs Mac vs whatever, and even in scenarios where application developers want to deliver the same features everywhere the minutiae of the native development patterns make it like herding cats.

It is basically the electrical plug of our era, in that it is a means to an end, never mind whether 110V 60Hz is the most efficient way to deliver power in the home in North America.

MaxBarraclough

We have JavaFX and Qt, and they're both better than ever, but they don't see much use. With JavaFX you can build and distribute a portable .jar file, and I think it can be used with JNLP/Java Web Start for distribution if you prefer that approach. With Qt, you're likely to be delivering self-contained native application packages in the target platform's native form.

(JavaFX has been carved out of the core JVM, which is annoying, but if the target machine has a JVM installed that bundles JavaFX, you're all set.)

scarface_74

Native for Windows, Macs, Linux, iPhones and Android devices?

Now imagine trying to update all of those native apps across a large enterprise or multiple large enterprises.

Since I use multiple devices: when everything is on the web, you also don't have to worry about syncing or conflict resolution like you do in semi-connected scenarios.

ryandrake

> Native for Windows, Macs, Linux, iPhones and Android devices?

> Now imagine trying to update all of those native apps across a large enterprise or multiple large enterprises.

With the tools we have now, it would absolutely not work. In my post I was imagining a parallel alternate universe where native development tools got all the brainpower and innovation over the last 30 years, instead of the web tools getting it.

mjevans

In HN spirit / guidelines, I'm going to presume the best.

Did you mean: "the web (as an application platform) will die" / once again swing back from mainframe / thin client to powerful local computing platforms?

In the spirit of empowering the user, I too hope the average user once again owns their destiny: the storage, computation, and control of their data. Though I think the web as a publishing medium does empower that user, if there are open platforms that promote the ability to choose any fulfillment partner they desire.

ericmcer

The web is just a convention that gained rapid adoption, so now browsers dominate software. As far as conventions go, it is not bad compared to some of the stuff humans have landed on. Better than paving over everything so we can drive and park cars all over, better than everything being single-use and disposable. The web has its ups and downs, but it is decent based on our track record.

fumar

I am exploring an alternative browser-like platform concept that would allow for near-native performance. However, established web protocols are hard to overcome.

slt2021

>>Native apps can do literally anything

like hack your bank account or steal your password...

pphysch

Stagnation in viz design has pretty much nothing to do with the shrinking native<->web capability gap, and the web is here to stay.

Animats

> That was the year I realized I was experiencing scrollytelling fatigue.

She nailed it.

The people who really, really have to look at graphs of numbers all day have a Bloomberg terminal. The graphics are visually unexciting but useful.

praptak

Even a small amount of data literacy makes you aware that visualizations can deceive. Pie charts make humans overestimate large percentages, a nonzero axis is borderline fraud, and the choice of colors can totally warp color scales.

I think that in this context it is expected for data literacy to make people suspicious of complex visualizations.

garciasn

Data literacy should come down to the data itself, not only the visualization of those data. Sure, pie charts are the bane of Tufte's existence, but even the best visualization of a particular segment of data can be misleading due to misrepresentation of the data underneath, from collection through analysis.

People should be far more skeptical of what they are fed. Data narratives are often misleading through manipulation of the data, its aggregation, visualization, and especially the interpretation within context. Data literacy needs to address all of these, not simply how it's visualized; that's merely the final step in the entire data and information lifecycle.

I’m not saying “do your own research;” instead, folks should think critically about what they’re seeing and attempt to understand what’s presented and put it inside the appropriate context before taking anything at face value that they’re shown, by any organization.

e: just formatting

roenxi

Pie charts are just as unreadable for medium and small percentages. They encode values as angles. Human perception is not suited to estimating angles relative to each other.

Analemma_

> nonzero axis is borderline fraud

This is an outrageously reductive meme that has long outstripped its actual usefulness and needs to die. The axis and scale should represent the useful range of values. For example, if your body temperature in Fahrenheit moves more than 5 degrees in either direction, you're having a medical emergency, but on a graph that starts from zero, this would barely be visible. Plotting body temperature from zero would conceal much more than it reveals, which is the opposite of what dataviz is supposed to do.
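A minimal sketch of the arithmetic behind this point, using the body-temperature example above (the axis ranges are my own illustrative choices):

```python
# How visible is a 5 degree F fever on differently scaled axes?
def visual_change(value_from: float, value_to: float,
                  axis_min: float, axis_max: float) -> float:
    """Fraction of the axis range that a change in value occupies on the plot."""
    return (value_to - value_from) / (axis_max - axis_min)

normal, fever = 98.6, 103.6

# Axis starting at zero: the change spans under 5% of the plot height.
zero_based = visual_change(normal, fever, 0.0, 110.0)

# Axis covering only a clinically useful range: the same change fills a third.
clinical = visual_change(normal, fever, 95.0, 110.0)

print(f"zero-based axis: {zero_based:.1%} of plot height")
print(f"clinical axis:   {clinical:.1%} of plot height")
```

The same 5-degree shift is roughly seven times more visually prominent on the clinically scaled axis, which is exactly the commenter's argument for choosing the axis by the useful range of values.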

teddyh

The only reasonable zero-value for temperature is 0K, which unfortunately leads to unreadable graphs. (All other temperature scales are completely arbitrary.) So for the specific case of temperatures, it is in fact completely reasonable to have a nonzero axis. But most graphs are not temperatures.

matkoniecz

This is a very rare case where a nonzero axis is justifiable.

Nevertheless, in >99% of the cases where I encounter a nonzero axis, it is misleading.

> The axis and scale should represent the useful range of values

This should not be confused with "the range of values present in the data".

Often the actually useful visualization would show that the value barely changed - but that makes for more truthful and boring news, so it is avoided.

tech_ken

The point of data presentation is to gist the most salient trends; an interactive chart where you can zoom in to the lowest granularity of the data basically defeats the purpose of the plot in the first place. Similarly, most animation in charts doesn't really add any meaningful visual data, it's just distracting. I think most consumers of data journalism got pretty bored of scrolling through some massive viz after only a few minutes, and why would they not? People read the news to have the critical points surfaced for them. They don't want to dig through the data themselves (and if they do, they're not going to be satisfied with the prebuilt animation). These kinds of things are IMO more fun and interesting to build than to actually learn something from.

ralferoo

Just looking at that "512 paths to the White House" graphic, I'd argue that it's more confusing than useful. Why is Florida at the top? Consider the point where it's "Obama has 255 ways" and "Romney has 1 way". What's the point of the massive arrow to Florida and then taking a very specific route to success? This would only make sense if there were a pre-determined order in which the results must come.

The way it's been done in the past in the UK, for instance, is "A needs X more seats to win, B needs Y more seats to win, Z more seats remain". Simple, clear, and no flashy graphics required.

I know the situation in the US is a bit more complicated with different numbers of representatives per state, but it's still not especially useful to prioritise one state over another in the graphic, because what's important is the relative difference between the totals so far received.

I get that there could be some more presentation towards uncalled results and the expected outcome, but it doesn't look like that graph gives that, which would be far more useful than this thing with arrows.

phendrenad2

The US news covers US elections from a really strange angle. Even as the votes are coming in, when there is nothing more the candidates can do to change the outcome, they act as though the candidates are still "looking for a path to victory" and list all of the "paths to victory" that could be possible. As though we're watching them stumble through a dark forest.

lxgr

I had the exact same thought here: https://news.ycombinator.com/item?id=43473149

Really bewildering from an epistemic point of view, even if it's "just a metaphor". (And do people really generally understand it to be just that?)

LegionMammal978

> Why is Florida at the top?

As you mention, the number of electors per state varies by quite a bit. E.g., in the 2012 election covered by the chart, Florida had 29 electors, Ohio had 18 electors, and North Carolina had 15 electors, which is why those three states appear at the top.

The main important effect is that (with only some small exceptions) if a candidate wins a simple majority of the votes in a state, then they receive all of that state's electors. E.g., if a candidate wins 50.01% of the Florida vote, they get 29 electors, but if they win 49.99% of the vote, they get 0 electors. See: the 2000 election, where the overall outcome depended on a few hundred votes in this way.
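The all-or-nothing mechanics described above can be sketched in a few lines (elector counts are the 2012 figures from this comment; the function name is my own):

```python
# Winner-take-all elector allocation, ignoring the small exceptions
# (Maine and Nebraska split some electors by district).
ELECTORS_2012 = {"FL": 29, "OH": 18, "NC": 15}

def allocate(state: str, vote_share: float) -> int:
    """Electors won by a candidate with the given share of a state's popular vote."""
    return ELECTORS_2012[state] if vote_share > 0.5 else 0

print(allocate("FL", 0.5001))  # 29: a hair over half takes everything
print(allocate("FL", 0.4999))  # 0: a hair under half takes nothing
```

The discontinuity at 50% is what makes "flipping" a large state like Florida worth 29 electors in one move, and why the chart orders states by elector count.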

This means there's a lot of focus on 'flipping' states one way or the other, since their electoral votes all come in blocks. What the chart is showing is that if Romney won Florida, he could afford to lose a few other contested states and still win the national election. But if Obama won Florida (as he in fact did), then Romney would need every other state to go his way (very unlikely!) if he still wanted to have a chance.

That is to say, Florida really was extremely important, given the structure of U.S. presidential elections: it would make or break a candidate's whole campaign, regardless of what happened in the rest of the country. And similarly, the remaining states are ordered by decreasing importance.

Of course, while results are being counted, you also see simpler diagrams of the current situation. The classic format is a map of the country with each state colored red or blue depending on which way it flips. This is often accompanied by a horizontal line with a red bar growing from one side, a blue bar growing from the other side, and a line in the middle. But people are interested in which states are more important than others, which creates the imagery of 'paths to win'.

ralferoo

Except, in the extreme example I cited from the article: "Obama has 255 ways" and "Romney has 1 way"

At that point, Romney had to win every remaining state to win. Florida was no more important than any other state that still hadn't declared a result. Whatever one came next would determine the result.

I'd also argue that the point you're making is obscured by this image. There's no way of determining from that image how many electors each state contributes, just lots of arrows and a red/blue outcome. IMHO, how it was actually shown on news programs today is much clearer than what the article is proposing.

damnitbuilds

Correct title: What Killed Innovation in the Pretty Diagram field?

I keep seeing books with interesting titles like "The evolution of clothing" and then see a subtitle like "In Wisconsin. Between 1985 and 1986."

badc0ffee

"From jean vests to jean jackets"

Ragnarork

I'm not sure about this. Why do we constantly need new ways of presenting data?

My main concern is that data eventually becomes easy to read and interpret, especially for people who are not used to it, who are less data- or science-savvy. That it's accessible.

It's good to try to find better ways to present certain cases, but it's only needed insofar as it's useful. Otherwise, I feel consistency is far better than churning out new ways to look at data that require effort on the consumer's part (no matter how beautiful or well presented) to figure out what they want to know from it.

Innovation for the sake of usefulness is good. Innovation for the sake of innovation feels... definitely not as good (although I wouldn't discard it completely).

nine_k

Have we already achieved the absolute optimal ways to visualize data? Maybe in some simple cases, yes, but not necessarily in all practical cases.

Should new and better ways to visualize data look drastically different from what we're used to? Maybe, but likely not very often. Revolutionary changes are rare, and incremental improvements are important.

mncharity

> So what next?

LLM discussion of visualizations?

I did guerrilla usability testing around teaching scale, which included this video[1] (stop-motion animation using CO molecules). Lots of people asked "What are those ripples?". IBM even had a supplementary webpage addressing this (which I no longer see, even on archive). People could easily ask this with me standing beside them, but not so much if viewing the content online. Which raised the UI question of how to encourage such questions.

With LLMs, perhaps people will be able to ask questions of a visualization? What is that? Why is that? What about ...? I don't understand ... Does this mean ...?

[1] IBM's A boy and his atom https://www.youtube.com/watch?v=oSCX78-8-q0 Making of: https://www.youtube.com/watch?v=xA4QWwaweWA

Avshalom

Unremarked is that while those examples are visually impressive, they're also unhelpful.

rqtwteye

Exactly. I see a lot of graphs and animations that look cool, but when you take a closer look, they don't convey much information.

fullshark

The economics don't support innovative web visualizations, a slight engagement boost for a day is the return on investment. If you're lucky it goes viral on social media, but there's far cheaper ways to accomplish that (e.g. inflammatory rhetoric).

jrm4

In this field? The answer is easy. Data, even pretty data -- maybe ESPECIALLY pretty data -- is not "information," and especially not "wisdom."

At the risk of using an odd term -- it's like -- "Data porn?"