
Why you can't color calibrate deep space photos

klysm

Recently I've been on a bit of a deep dive regarding human color vision and cameras. This left me with the general impression that RGB Bayer filters are vastly over-utilized (mostly due to market share), and they are usually not great for tasks other than mimicking human vision! For example, if you have a stationary scene, why not put a whole bunch of filters in front of a mono camera and get much more frequency information?

nothacking_

That's common in high-end astrophotography, and used almost exclusively at professional observatories. However, scientists like filters that are "rectangular", with a flat passband and a sharp falloff, very unlike human color vision.

rachofsunshine

Assuming the bands are narrow, that should allow approximately true-color images, shouldn't it?

Human S cone channel = sum over bands of (intensity in that band) * (human S-cone sensitivity in that band)

and similarly for the M and L cone channels, which converges to the integral representing true color in the limit of narrow bands.
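
In code, the approximation is just a weighted sum per cone (a sketch with made-up band centers and sensitivities; real work would use tabulated cone fundamentals, e.g. the Stockman-Sharpe data):

    import numpy as np

    # Hypothetical narrowband measurements: band centers (nm) and intensities
    band_centers = np.array([420.0, 460.0, 500.0, 540.0, 580.0, 620.0, 660.0])
    band_width = 40.0                                      # nm per filter
    intensity = np.array([0.2, 0.5, 0.9, 1.0, 0.8, 0.6, 0.3])

    # S-cone sensitivity sampled at the band centers (placeholder values)
    s_cone = np.array([0.8, 0.4, 0.1, 0.02, 0.0, 0.0, 0.0])

    # Riemann sum over bands; as the bands narrow this converges to the integral
    S_response = np.sum(intensity * s_cone * band_width)
    print(S_response)  # repeat with M- and L-cone curves for the other channels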

Are the bands too wide for this to work?

nothacking_

> Are the bands too wide for this to work?

For wideband filters used for stars and galaxies, yes. Sometimes the filters are wider than the entire visible spectrum.

For narrowband filters used to isolate emission from a particular element, no. If you have just the Oxygen-III signal isolated from everything else, you can composite it as a perfect turquoise color.
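
To make "perfect turquoise" concrete, here is a minimal sketch that converts the O-III line to sRGB, using approximate CIE 1931 standard observer values near 501 nm and the standard XYZ-to-sRGB matrix (the tabulated values are hardcoded and rounded):

    import numpy as np

    # CIE 1931 color matching values (xbar, ybar, zbar) near O-III at 500.7 nm
    XYZ = np.array([0.0049, 0.3230, 0.2720])

    # Standard XYZ -> linear sRGB (D65) matrix
    M = np.array([[ 3.2406, -1.5372, -0.4986],
                  [-0.9689,  1.8758,  0.0415],
                  [ 0.0557, -0.2040,  1.0570]])
    rgb = M @ XYZ

    rgb = np.clip(rgb, 0.0, None)  # a pure spectral color falls outside sRGB: clip
    rgb /= rgb.max()               # normalize brightness
    rgb = np.where(rgb <= 0.0031308, 12.92 * rgb,
                   1.055 * rgb ** (1 / 2.4) - 0.055)  # gamma encode
    print(rgb)  # ~[0, 1.0, 0.64]: a saturated green-cyan, i.e. turquoise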

kabouseng

Yes, of course, but with the obvious disadvantage that you lose resolution for every filter you add. Then you say let's just increase the pixel count, which means a smaller pixel pitch. But then you lose low-light sensitivity, have to decrease your lens f/#, need more expensive lenses, etc. Which is why it isn't done for commercial / mass-market sensors.
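
Back-of-the-envelope, the light cost is quadratic in both knobs, which is why the lens gets expensive fast (a toy calculation, not tied to any particular sensor):

    # Photons per pixel scale with pixel area and inversely with f-number squared
    def relative_signal(pitch_um: float, f_number: float) -> float:
        return pitch_um ** 2 / f_number ** 2

    base = relative_signal(pitch_um=4.0, f_number=2.8)
    print(relative_signal(2.0, 2.8) / base)  # 0.25: halving pitch costs 4x light
    print(relative_signal(2.0, 1.4) / base)  # 1.0: recovering it needs f/# halved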

jofer

In case you weren't already aware, that last bit basically describes most optical scientific imaging (e.g. satellite imaging or spectroscopy in general).

adornKey

And don't forget about polarization! There's more information out there than just frequency.

chaboud

I think you want a push broom setup:

https://www.adept.net.au/news/newsletter/202001-jan/pushbroo...

Hyperspectral imaging is a really fun space. You can do a lot with some pretty basic filters and temporal trickery. However, once you’re out of hot mirror territory (near IR and IR filtering done on most cameras), things have to get pretty specialized.

But grab a cold mirror (a filter that cuts visible light and passes IR) and a night-vision camera for a real party on the cheap.
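
The basic pushbroom idea fits in a few lines of numpy (a toy sketch; each frame is one across-track line dispersed over wavelength, and scene motion sweeps out the other spatial axis):

    import numpy as np

    n_steps, n_x, n_bands = 480, 640, 100  # scan steps, across-track pixels, bins

    def read_line_frame(i: int) -> np.ndarray:
        """Stand-in for the sensor: one (across-track, wavelength) slice."""
        rng = np.random.default_rng(i)
        return rng.random((n_x, n_bands))

    # Stack one slice per motion step into a (y, x, wavelength) hyperspectral cube
    cube = np.stack([read_line_frame(i) for i in range(n_steps)], axis=0)
    print(cube.shape)  # (480, 640, 100)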

cyb_

Having dabbled a bit in astrophotography, I would suggest that color is best used to bring out the structure (and beauty) of the object. Trying to faithfully match the human eye would, unfortunately, cause a lot of that data to be harder to see/understand. This is especially true in narrowband.

stavros

Plus, what's the point? It's not like anything will change if the object looks a bit more green rather than blue; it makes no difference to the wonder of the universe.

Retr0id

The next space mission should be to leave a colour calibration chart on the moon.

embedded_hiker

They brought a gnomon, with a color chart, on the Apollo missions. They would set it up for many of the pictures of samples.

https://airandspace.si.edu/collection-objects/gnomon-lunar-a...

pgreenwood

Here's a shot of a color chart on the moon from Apollo 17 (AS17-137-20900):

https://tothemoon.im-ldi.com/data_a70/AS17/extra/AS17-137-20...

jofer

The moon itself already is one. Moonshots are widely used in calibration, at least for earth observation satellites. The brightness of the full moon at each wavelength at each day of the year is predictable and well-known, so it makes a good target to check your payload against.

shagie

They also put color calibration charts on Mars rovers. For example https://www.lucideon.com/news/colour-standards-on-mars

JNRowe

There is even a Damien Hirst¹ haphazardly spread about the surface for that purpose.

One of the great gifts Pillinger² had was being able to shake up public interest via pop culture; there was also a call sign by Blur for Beagle 2.

¹ https://www.researchgate.net/figure/Spot-Painting-Beagle-2-C...

² https://en.wikipedia.org/wiki/Colin_Pillinger

strogonoff

It is not just in space that nothing is lit by a uniform light source or with uniform brightness. The same is true of many casual photos you would take on this planet.

Outside of a narrow set of scenarios like “daylight” or “cloudy”, and especially if you shoot with a mix of disparate artificial light sources at night, you have a very similar problem. Shooting raw somewhat moves the problem to the development stage, but it remains a challenge: balance for one source, and the others look weird. Yet (and this is a paradox not present in deep space photography) the same scene can astoundingly look beautiful to the human eye!
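
A minimal illustration of the "balance for one, the others look weird" problem, with invented linear-RGB values:

    import numpy as np

    # Linear raw RGB of a neutral gray card under two different sources
    under_tungsten = np.array([0.9, 0.6, 0.3])  # warm: strong red, weak blue
    under_daylight = np.array([0.6, 0.6, 0.6])  # already neutral

    # Per-channel gains that neutralize the tungsten-lit region...
    gains = 0.6 / under_tungsten
    print(gains * under_tungsten)  # [0.6 0.6 0.6]: tungsten area now neutral
    print(gains * under_daylight)  # [0.4 0.6 1.2]: daylight area now looks blue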

In the end, it is always a subjective creative job that concerns your interpretation of light and what you want people to see.

HPsquared

I suppose the human visual system is already adapted to deal with the same problem.

jofer

These same things apply to satellite images of the Earth as well. Even when you have optical bands that roughly correspond to human eye sensitivity, they have quite a different response pattern. You're also often not working with those wavelength bands in the visualizations you make.

Scientific sensors want as "square" a spectral response as possible. That's quite different from human eye response. Getting a realistic RGB visualization from such a sensor is very much an art form.
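
A sketch of one common approach (not any particular mission's pipeline): fit a 3x3 matrix from the sensor's square bands to smooth human-like response curves, by least squares over training spectra:

    import numpy as np

    wl = np.arange(400, 701, 10).astype(float)  # wavelength samples, nm

    def square_band(lo, hi):
        return ((wl >= lo) & (wl < hi)).astype(float)

    def gaussian(mu, sigma):
        return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

    # Sensor: idealized square bands; target: smooth overlapping curves
    # (band edges and Gaussian stand-ins invented for illustration)
    S = np.stack([square_band(600, 660), square_band(520, 580),
                  square_band(430, 490)])
    H = np.stack([gaussian(600, 40), gaussian(550, 40), gaussian(450, 30)])

    # Solve for M such that sensor_reading @ M ~= human_reading, in least squares
    rng = np.random.default_rng(0)
    spectra = rng.random((200, wl.size))        # random training spectra
    A = spectra @ S.T                           # what the sensor measures
    B = spectra @ H.T                           # what the eye would measure
    M, *_ = np.linalg.lstsq(A, B, rcond=None)
    print(M)  # apply to each pixel's band triplet to approximate eye response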

ekunazanu

> Because there’s a lot of overlap between the red and green cones, our brain subtracts some green from red, yielding this spectral response:

No, cones do not produce a negative response. The graph shows the intensities of the primaries required to recreate the spectral colour at each wavelength. A negative value means that primary had to be added to the spectral colour being matched, rather than mixed with the other primaries.

https://en.wikipedia.org/wiki/CIE_1931_color_space#Color_mat...
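
You can reproduce a negative lobe directly. Match monochromatic 500 nm cyan with three monochromatic primaries (700, 546, 436 nm), using approximate XYZ values read off the standard observer table with each primary scaled to unit luminance; the red amount comes out negative:

    import numpy as np

    # Columns: XYZ of R (700 nm), G (546 nm), B (436 nm) primaries, Y scaled to 1
    P = np.array([[2.78, 0.384, 17.5],
                  [1.00, 1.000,  1.0],
                  [0.00, 0.013, 93.3]])
    cyan_500nm = np.array([0.0049, 0.3230, 0.2720])  # XYZ of 500 nm light

    r, g, b = np.linalg.solve(P, cyan_500nm)
    print(r, g, b)  # r ~ -0.07: red must be added to the target side to match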

rf15

> No, cones do not produce a negative response.

not what was claimed at all...

mystraline

The proper color of an image would be a multispectral radiograph, similar to a waterfall plot, for each point. Each FFT bin would be 100 GHz in size, and the range would be over 1000 THz. And in a way, that's what a color sensor is doing at the CCD level too: collapsing and averaging the radio energy it's susceptible to into a specific color.
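
Putting numbers on that (a toy sketch): rebinning a wavelength-sampled spectrum into 100 GHz frequency bins takes about 4000 bins just for the visible band:

    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    wl_nm = np.linspace(380, 780, 4001)       # visible band, sampled in wavelength
    power = np.ones_like(wl_nm)               # flat toy spectrum

    freq_thz = C / (wl_nm * 1e-9) / 1e12      # ~384 to ~789 THz
    bins = np.arange(freq_thz.min(), freq_thz.max() + 0.1, 0.1)  # 100 GHz bins
    binned, _ = np.histogram(freq_thz, bins=bins, weights=power)
    print(binned.size)  # ~4000 bins for the visible band alone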

hliyan

I still haven't forgiven whoever made Voyager's first images of Jupiter's moon Io bright red and yellow, and the Saturnian moon Enceladus green.

ianburrell

Neptune was shown as deep blue for a long time, but it is really a similar color to Uranus, a pale greenish-blue.

jpizagno

As a former astronomer, I thought this was a great post. (The website could use some post-90s styling, however :> )

indy

That aesthetic is how you know you're on a good astronomy site

kurthr

What's the white point? Is it D65? Not when the sun isn't out.

klysm

I've always been confused by what the white point actually _means_. Since we are dealing with strictly emissive sources here, and not reflected sunlight, does the whitepoint even mean anything?

esafak

In a scene lit overwhelmingly by one approximately Planckian light source, the white point is the color of the closest Planckian light source.

If the light source is not approximately Planckian, or if multiple illuminants have different temperatures, a white point is not defined.
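
When a white point is defined, the "closest Planckian source" step has a standard shortcut: McCamy's approximation from CIE 1931 (x, y) chromaticity to correlated color temperature:

    # McCamy's approximation, reasonable near the Planckian locus (~2000-12500 K)
    def mccamy_cct(x: float, y: float) -> float:
        n = (x - 0.3320) / (0.1858 - y)
        return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

    print(mccamy_cct(0.3127, 0.3290))  # D65 chromaticity -> ~6500 K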

klysm

So in this case there is no sensible white point since there is no illuminant right?

dheera

It's worth noting that many NASA images use the "HSO" palette which is false color imagery. In particular the sulfur (S) and hydrogen (H) lines are both red to the human eye, so NASA assigns them to different colors (hydrogen->red, sulfur->green, oxygen->blue) for interpretability.
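
Mechanically, the palette is just channel assignment on the calibrated mono frames (a sketch; random arrays stand in for stacked narrowband exposures):

    import numpy as np

    rng = np.random.default_rng(0)
    h_alpha = rng.random((1024, 1024))  # hydrogen-alpha, 656 nm (red to the eye)
    s_ii    = rng.random((1024, 1024))  # sulfur-II, 672 nm (also red to the eye)
    o_iii   = rng.random((1024, 1024))  # oxygen-III, 501 nm (turquoise)

    # HSO assignment as described: hydrogen->red, sulfur->green, oxygen->blue
    rgb = np.dstack([h_alpha, s_ii, o_iii])
    print(rgb.shape)  # (1024, 1024, 3): the two red lines are now distinguishable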

bhouston

> Many other cameras, particularly those with aggressive UV-IR cut filters, underrespond to H-a, resulting in dim and blueish nebula. Often people rip out those filters (astro-modification), but this usually results in the camera overresponding instead.

Hmm... astrophotographers do not use cameras with UV-IR cut filters at all. For example, I owned a few of these:

https://www.zwoastro.com/product-category/cameras/dso_cooled...

They also generally do not use sensors that have Bayer filters. This also screws things up.

Instead they use monochromatic sensors with narrowband filters (either one band or multiple) over them, keyed to specific celestial emissions. The reason is that this gets rid of the extensive light pollution and bumps up the signal-to-noise ratio for the celestial objects, especially the small faint details. Stuff like this:

https://telescopescanada.ca/products/zwo-4-piece-31mm-ha-sii...

https://telescopescanada.ca/products/zwo-duo-band-filter

Often these are combined with a true color capture (or individual RGBL captures) just to get the stars coloured properly.

Almost everything you see in high-end astrophotography is false color, because they map these individual narrowband captures on the monochrome sensors to interesting colours, and often spend a lot of time manipulating the individual channels.

This is done at the medium to high end using the PixInsight software - including by NASA for the recent James Webb images: https://www.pbs.org/video/new-eye-on-the-universe-zvzqn1/

The James Webb telescope has a set of 29 narrowband filters for its main sensor: https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam...

Hubble pictures were famously coloured in a particular way that has a formal name:

https://www.astronomymark.com/hubble_palette.htm

(My shots: https://app.astrobin.com/u/bhouston#gallery)

recipe19

What you're describing is the domain of a very, very small number of hobbyists with very deep pockets (plus various govt-funded entities).

The vast majority of hobby astrophotography is done pretty much as the webpage describes it, with a single camera. You can even buy high-end Canon cameras with IR filters factory-removed specifically for astrophotography. It's big enough of a market that the camera manufacturer accommodates it.

bhouston

> What you're describing is the domain of a very, very small number of hobbyists with very deep pockets

Sort of. The telescope used for the Dumbbell nebula captures featured in the article is worth around $1000, and the mount is probably $500. A beginner cooled monochrome astrophotography camera is around $700, and if you want filters and a controller, another $500.

There are quite a few people in the world doing this, upwards of 100K:

https://app.astrobin.com/search

Various PixInsight videos have 100K+ views: https://youtu.be/XCotRiUIWtg?si=RpkU-sECLusPM1j-&utm_source=...

Intro to narrowband also has 100K+ views: https://youtu.be/0Fp2SlhlprU?si=oqWrATDDwhmMguIl&utm_source=...

looofooo0

Some even scratch off the Bayer pattern of old cameras.

tecleandor

You don't need very big pockets for that.

Today you can find very affordable monochromatic astrophotography cameras, and you can also modify cheap DSLR cameras, or even compact cameras, to remove their IR/UV/low-pass filters. You can even insert a different semi-permanent internal filter after that (like an IR or UV band-pass).

I've done a Nikon D70 DSLR and a Canon Ixus/Elph compact.

Some cameras are very easy, some very difficult, so better to check some tutorials first before buying a camera. And there are companies that will do the conversion for you for a few hundred dollars (probably 300 or 400).

looofooo0

You can even do the conversion DIY.

tomrod

And the entire earth observation industry, which doesn't look the same way but uses the same base tech stack.

verandaguy

> astrophotographers do not use cameras with UV-IR cut filters at all

I'll be pedantic here and say that the author's probably talking to people who use DSLRs with adapter rings for telescopes. I've been interested in doing this for a while (just unable to financially justify it), and I think this is actually something people in this niche do.

Then there are things like the Nikon D810A, which remove the UV-IR filter from the factory (but IIRC retain the Bayer filter).

bhouston

My recommendation, as someone who started with a DSLR and then modded it to remove the UV-IR filter: it would have been better to just skip to a beginner cooled mono astrophotography camera, like the ASI533MM Pro. It is a night and day difference in terms of quality, at roughly the same cost, and it automates much better.

A high end DSLR is a huge waste of money in astrophotography. Spend the same amount on a dedicated astrophotography camera and you’ll do much better.

schoen

> It is a night and day difference

Particularly high praise in astronomy!

verandaguy

How do you recover colour from a mono astro camera? Just run it for 3 exposures behind a gel of each of the R/G/B colours, then comp?