Sim Daltonism: The color blindness simulator
21 comments · March 28, 2025 · oniony
politelemon
That sounds more useful, cheers. I wonder why they haven't made it a non-hidden option.
OJFord
It's annoying enough (especially as a child at school, but not at all exclusively) to be colourblind and put up with 'so what colour is this' without people waving their phone around exclaiming about how it says you see things, thanks.
cwillu
Sure, but at the same time, it's a useful tool to show people how unusable their UI is, using something they already have.
The problem with “what colour is this? what colour is that?” is not the question, it's that the question comes up with an expectation of an answer, regardless of the context. If I'm _never_ willing to answer the annoying question, that makes me the asshole regardless of how poor my colour vision is.
bluechair
I’ll highlight a note from the developer:
Sim Daltonism lets you see through the eyes of someone with color blindness. While the colors shown are a good approximation of what a color blind person would see, you should not expect them to be perfect.
Everyone has their own perception of colors that differs slightly from other people's, and color blindness is often partial, to varying degrees. More importantly, cameras do not have the same spectral response as the cones in your eyes, so the simulation has to make some assumptions about the frequency composition of the colors.
I’m colorblind and haven’t found a simulator that comes close to what it’s like for me. This app doesn’t do it either.
adamgordonbell
I'm also color blind (red-green), but I'm not sure how you expected to be able to judge it.
You can't see how the app affects colors in the absence of your own color blindness to compare against.
egypturnash
What would “close to what it’s like” entail exactly?
Would it mean that when you look at a simulation of the effects of your colorblindness, you see zero change from the unaltered view?
Or would it mean that it looks absolutely nothing like what you see because it’s transforming the base image by clamping the input colors to what you can see, and stretching that decimated color space out over the entire range of normal sensitivity?
Sometimes I suspect that the range of color qualia the human mind experiences is the same regardless of what actual color receptors one has; the sensation we call “red” is assigned to the lowest end of the input scale, regardless of whether or not the lowest end is at the normal wavelength, and that every filter that just removes color and provides a duller image is doing completely the wrong thing. But it’s a much simpler transformation to implement.
(I think the key to checking this would involve violently clashing colors. Or a way to make someone start growing new cone cells in their eyes.)
Also if you have had entirely too many conversations with the normies about “what does it look like for you” then please just ignore this, my SO is partially colorblind and gets that a lot!
swiftcoder
> Would it mean that when you look at a simulation of the effects of your colorblindness, you see zero change from the unaltered view?
Ideally, yes. Although it's unlikely to match any one person's exact colour vision.
If you look at filtered images side-by-side, say from this collection on Bored Panda[1], to me the deutan images and the normal image are pretty much indistinguishable, while the protan image is close but slightly too green.
> Or would it mean that it looks absolutely nothing like what you see because it’s transforming the base image by clamping the input colors to what you can see, and stretching that decimated color space out over the entire range of normal sensitivity?
That's how most "colour blind filters" look in practice, yes. I don't think a lot of folks are setting up the transform correctly (or they are just straight-up using a colourblindness preview filter as if it were a colourblindness correction filter).
[1]: https://www.boredpanda.com/different-types-color-blindness-p...
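For anyone wondering what setting up the transform "correctly" involves: the usually cited approach (Brettel et al. 1997; Viénot et al. 1999) converts RGB into an LMS cone-response space, projects out the contribution of the missing cone type, and converts back, rather than just desaturating or remapping hues. Below is a minimal sketch in Python/NumPy using the deuteranopia coefficients commonly quoted from Viénot et al.; treat the numbers as approximate, and note that proper sRGB linearization is skipped for brevity.

```python
import numpy as np

# Linear-RGB -> LMS matrix and deuteranopia projection, with coefficients as
# commonly quoted from Vienot, Brettel & Mollon (1999). Verify against the
# paper before relying on them; sRGB gamma handling is omitted here.
RGB_TO_LMS = np.array([
    [17.8824,   43.5161,  4.11935],
    [3.45565,   27.1554,  3.86714],
    [0.0299566, 0.184309, 1.46709],
])
LMS_TO_RGB = np.linalg.inv(RGB_TO_LMS)

# Deuteranopia: the M-cone response is replaced by a combination of L and S,
# i.e. colors are projected onto the surface a deuteranope can distinguish.
DEUTERANOPIA = np.array([
    [1.0,      0.0, 0.0],
    [0.494207, 0.0, 1.24827],
    [0.0,      0.0, 1.0],
])

def simulate(rgb, projection=DEUTERANOPIA):
    """Simulate dichromat vision for an (..., 3) array of linear RGB in [0, 1]."""
    lms = rgb @ RGB_TO_LMS.T          # into cone space
    lms_sim = lms @ projection.T      # collapse onto the dichromat's surface
    out = lms_sim @ LMS_TO_RGB.T      # back to RGB
    return np.clip(out, 0.0, 1.0)

# Example: a pure red and a mid green end up hard to tell apart.
print(simulate(np.array([[1.0, 0.0, 0.0], [0.0, 0.6, 0.0]])))
```

The "clamping" described above is exactly this projection: the image is collapsed onto the colors a dichromat can distinguish, with no attempt to stretch the reduced range back out.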
Terr_
Part of it may be the display technology, rather than what the software thinks should look right.
Those RGB pixels are chosen and tuned to trick a particular baseline Homo sapiens setup: a certain set of chemical sensors and a certain neurological weighting of their inputs. Light from natural sources is dramatically more varied.
fisherjeff
I have always felt the same way - this article at The Verge is the only thing I’ve ever thought has gotten close:
https://www.theverge.com/23650428/colorblindness-design-ui-a...
rollcat
Tangential: I've been playing around with the color blindness filters on my iPhone, and the grayscale filter had me thinking for a moment. I've set it to 50%, set up the accessibility shortcut (triple-click on the home button) to toggle it, and found myself using my phone with the filter on basically 99% of the time.
It's been a couple of months, and I've noticed that the oversaturated colors were making me slightly agitated, somehow captivating my attention. I sometimes disable the filter to look at a particular picture, or to figure out a detail in some context where the colors are already desaturated. Now my only wish is that it was less linear, maybe like a compressor in audio - maintain detail until it starts approaching the ceiling.
It may be a good middle ground if you'd consider something like <thelightphone.com> but don't want to actually switch.
Also, Rob Pike: <https://commandcenter.blogspot.com/2020/09/color-blindness-i...>
Also: <https://duckduckgo.com/?q=plan+9+from+bell+labs+rio&ia=image...>
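A rough sketch of the saturation "compressor" idea from the comment above, assuming an HSV-style model with a hard knee: saturation below a threshold is untouched, anything above it is reduced by a fixed ratio. The threshold, ratio, and function name here are my own, not anything iOS exposes.

```python
import colorsys

def compress_saturation(r, g, b, threshold=0.5, ratio=4.0):
    """Leave low-saturation colors alone; compress saturation above the knee,
    roughly the way an audio compressor treats levels above its threshold."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    if s > threshold:
        s = threshold + (s - threshold) / ratio
    return colorsys.hsv_to_rgb(h, s, v)

# A fully saturated red keeps its hue but loses most of its "shout".
print(compress_saturation(1.0, 0.1, 0.1))
```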
itishappy
I bet there's a way to accomplish what you're looking for with ICC profiles. They allow arbitrary functions and LUTs in addition to standard matrix math. There's typically a way to set this for your OS, an individual image, and I'm pretty sure an individual app as well (but I'd imagine only your own).
Edit: Actually, this may not be editable on iOS, but it is on macOS, Windows, probably Linux, and it looks like Android too.
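The LUT part is roughly this: bake an arbitrary RGB-to-RGB transform into a small 3D table sampled on a grid and apply it by lookup, which is essentially the mechanism ICC profiles use for transforms that aren't plain matrix math. A toy sketch, where the grid size and the 50% desaturation transform (like the 50% grayscale filter mentioned above) are illustrative, not taken from any real profile:

```python
import numpy as np

N = 17  # LUT resolution per axis

def transform(rgb):
    """Toy transform to be baked into the table: 50% desaturation."""
    gray = rgb.mean(axis=-1, keepdims=True)
    return 0.5 * rgb + 0.5 * gray

# Sample the transform on an N x N x N grid of RGB values.
grid = np.linspace(0.0, 1.0, N)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
lut = transform(np.stack([r, g, b], axis=-1))  # shape (N, N, N, 3)

def apply_lut(rgb):
    """Nearest-neighbour lookup; a real implementation would interpolate."""
    idx = np.round(rgb * (N - 1)).astype(int)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

print(apply_lut(np.array([0.9, 0.2, 0.2])))
```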
anonymars
You can do this on Android as well (manufacturer results may vary...)
This is on Android 14, but I initially turned it on in an earlier version:
1. Unlock the Developer options (you can search for this; it depends on the OS version).
2. In Developer options, scroll down to "Simulate color space" and choose grayscale.
3. Back in the main settings, go to Accessibility --> Advanced; there is an "Accessibility button" option.
4. Set that to "Color correction".
Now I have a small icon (it looks like a person) that I can use to toggle monochrome. In fact, it was someone I overheard on a train, who had this turned on on his iPhone as described above for the same reason ("lowers the dopamine response", he explained to the conductor), that got me intrigued enough to look into it.
Samsung also has a decent automation infrastructure ("modes and routines"), so I set that up to automatically disable the color correction / grayscale in certain apps (camera, photos, maps, etc.)
DawsonBruce
Huge fan of this tool for working on GUI implementations, ensuring the color choices and contrasts make sense for users that see GUIs differently than I do.
egypturnash
This is such a useful tool, I constantly pop it up to check contrast in my art.
AprilArcus
Could this be used in reverse to correct for color vision disorders, e.g. by punching down greens and blues and punching up reds into the outer range of the P3 gamut?
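What this describes is roughly what "daltonization" algorithms attempt: simulate the dichromat's view, take the difference from the original (the information that would be lost), and shift that difference into channels the viewer can still distinguish. A minimal sketch, assuming a simulate() step like the one sketched earlier in the thread; the redistribution matrix is the deuteranopia one that circulates in daltonize implementations and should be treated as illustrative:

```python
import numpy as np

# Error-redistribution matrix commonly used for deuteranopia in daltonize
# implementations: the lost red/green difference is pushed into the green and
# blue channels. Treat the exact values as illustrative.
REDISTRIBUTE = np.array([
    [0.0, 0.0, 0.0],
    [0.7, 1.0, 0.0],
    [0.7, 0.0, 1.0],
])

def daltonize(original, simulated, redistribute=REDISTRIBUTE):
    """Shift the information lost in simulation into channels the viewer keeps.

    `original` and `simulated` are (..., 3) RGB arrays in [0, 1]; `simulated`
    would come from a dichromacy simulation such as the earlier sketch.
    """
    error = original - simulated              # what the viewer can't see
    compensation = error @ redistribute.T     # move it into visible channels
    return np.clip(original + compensation, 0.0, 1.0)

# Usage (with a `simulate` function like the one sketched earlier):
#   corrected = daltonize(image, simulate(image))
```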
lastdong
Kudos to the developer for creating such an optimized app! It's only 444KB for iOS.
machine_ghost
[flagged]
swiftcoder
It's not for you, clearly. But there's plenty of platform-specific software in the world, especially when it involves low-level system extensions.
For anyone on Android, this is possible with the Simulate Color Space option in the hidden Developer Tools menu. Amusingly, it works with the camera too, so you can look around the real world with some sense of what it's like.