
Adobe Lightroom's AI Remove feature added a Bitcoin to bird in flight photo

pclmulqdq

I have to say that I think the photo without the Lightroom processing actually looks better. The second one hasn't just added a bitcoin, it has also added the "AI shimmer" that seems to be a part of a lot of generated images. I can't put my finger on exactly what the characteristic is, but my first instinct (separate from the bitcoin, which was hard to find) is "that's an AI picture." Someone should just spend some time in the blur tool if they don't like those glints.

serviceberry

I don't think there's any AI-fication going on in that photo. The modified version has a more compressed tone curve to bring out more detail, along with jacked up saturation (especially evident for water). This is similar to what most cell phones do by default to make photos look more pleasing.

I do agree that the original looks better, but the author of the post clearly prefers the modified version.

strogonoff

Clipped highlights in digital photography are simply impossible to eliminate in post-processing without conjuring nonexistent information. Even if you shoot raw. Different raw processors use different tricks, color propagation and such, but naturally they work best when the clipped highlights are really small. I would not be surprised if tools like Lightroom invoke ML at the first hint of clipping (because why not, if all you have is a hammer…).

Pro tip: Digital sensors are much less forgiving than negative film when it comes to exposing highlights. With a bit of foresight they are best tackled at shooting time. Highlights from water/glass reflections are tamed by a fairly cheap polarizing filter, and if you shoot raw you should do the opposite of what you would with negative film and always underexpose a scene with bright highlights (especially if the highlights are large or are in your subject of interest). Let it be dark; you will have more noise, but noise is manageable without having to invent what doesn’t exist in the parts that are the most noticeable to the human eye.

serviceberry

> Clipped highlights in digital photography are simply impossible to eliminate in post-processing without conjuring nonexistent information. Even if you shoot raw.

Huh? This used to be true twenty years ago, but modern sensors in prosumer cameras capture a lot more dynamic range than can be displayed on the screen or conveyed in a standard JPEG. If you shoot raw, you absolutely have the information needed to rescue nominally clipped highlights. You get 2-4 stops of latitude without any real effort.

The problem here is different. If the rest of the scene is exposed correctly, you have to choose one or the other: overexposed highlights or an underexposed subject. The workaround is to use tone mapping or local contrast tricks, but these easily give your photos a weird "HDR" look.
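
To make that concrete, here's a rough sketch of the kind of raw-side rescue I mean, using rawpy (the filename and exact parameter values are made up, and every raw processor's highlight recovery behaves a bit differently):

    import rawpy

    # Sketch: pull back nominally clipped highlights from a raw file.
    # exp_shift < 1.0 darkens the linear rendering, and highlight_mode asks
    # LibRaw to reconstruct clipped channels instead of hard-clipping to white.
    with rawpy.imread("bird.NEF") as raw:  # hypothetical filename
        rgb = raw.postprocess(
            use_camera_wb=True,
            no_auto_bright=True,                       # don't let auto-brightening re-clip
            exp_shift=0.5,                             # roughly -1 stop in linear space
            highlight_mode=rawpy.HighlightMode.Blend,  # blend reconstructed highlights
            output_bps=16,                             # keep 16 bits for later tone mapping
        )
    # rgb is a 16-bit numpy array; apply your own tone curve before exporting a JPEG.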

DrewADesign

nah-- even in the supplied jpg the histogram shows they're not clipping the highlights, and if you crank down the brightness, you can see the detail.
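
If you want to check for yourself, a quick-and-dirty version of that check with numpy/Pillow (the filename is made up):

    import numpy as np
    from PIL import Image

    # Count how many pixels sit at or near the top of the 8-bit range,
    # i.e. whether the highlights are actually clipped in the JPEG.
    img = np.array(Image.open("bird_original.jpg").convert("L"))
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    near_white = hist[250:].sum() / img.size
    print(f"pixels within 5 levels of white: {near_white:.4%}")

    # "Crank down the brightness" to reveal detail hiding near white.
    darkened = (img.astype(np.float32) * 0.6).astype(np.uint8)
    Image.fromarray(darkened).save("darkened.png")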

pclmulqdq

Oh, yeah, compression of the dynamic range combined with increased brightness makes sense. It's not exactly just Stable Diffusion that produces that look, but also things like ML filters, etc.

ruraljuror

This reminds me of the soap-opera effect[0] on modern tvs. I have difficulty watching a movie on someone’s tv with it enabled, but they don’t even seem to notice.

0: https://en.wikipedia.org/wiki/Soap_opera_effect

johnnyanmac

A truly bizarre effect. One of the first times in my life I was ever thinking "wait, this looks TOO smooth. it's weird". As if my eyes just instinctively knew there were fake frames (before I understood the concept of "frames").

samsartor

That could be the VAE? The "latent" part of latent diffusion models is surprisingly lossy. And however much the image is getting inpainted, the entire thing gets encoded and decoded.

Edit: I'll note some new models (SD3 and Flux) have a wider latent dim and seem to suffer from this problem less.

AI generated images are also biased strongly towards medium lightness. The photographer's adjustment of the tone curve may simply give it that "look".
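
If you want to see how lossy the latent round trip is, you can push a photo through one of the public SD VAEs with diffusers and compare it to the input. A minimal sketch (the model choice and filename are just examples, not what Adobe uses):

    import numpy as np
    import torch
    from diffusers import AutoencoderKL
    from PIL import Image

    # Encode and decode a photo through an SD-era VAE to measure round-trip loss.
    vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-mse").eval()

    img = Image.open("photo.jpg").convert("RGB").resize((512, 512))
    x = torch.from_numpy(np.array(img)).float() / 127.5 - 1.0  # scale to [-1, 1]
    x = x.permute(2, 0, 1).unsqueeze(0)

    with torch.no_grad():
        latents = vae.encode(x).latent_dist.mean  # 8x spatial downsample, 4 channels
        recon = vae.decode(latents).sample

    err = (recon.clamp(-1, 1) - x).abs().mean().item()
    print(f"mean absolute reconstruction error on the [-1, 1] scale: {err:.4f}")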

AuryGlenz

You absolutely do not need to encode and decode the whole image, even in ComfyUI. All you need to do is composite the changed areas back in the original photo. There are nodes for that and I’m sure that’s what Adobe does as well, if they even encode in the first place. These tools don’t really work quite like inpainting. There’s no denoise value - it’s all or nothing.

I’ve used Photoshop’s generative fill many times on singular images and there’s no loss on the ungenerated parts.
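
The compositing step itself is trivial, something along these lines (filenames are made up; the mask is white where pixels were regenerated):

    import numpy as np
    from PIL import Image

    # Paste only the regenerated region back onto the untouched original,
    # so pixels outside the mask never go through an encode/decode round trip.
    original = np.array(Image.open("original.png").convert("RGB"), dtype=np.float32)
    generated = np.array(Image.open("generated.png").convert("RGB"), dtype=np.float32)
    mask = np.array(Image.open("mask.png").convert("L"), dtype=np.float32) / 255.0

    blended = generated * mask[..., None] + original * (1.0 - mask[..., None])
    Image.fromarray(blended.astype(np.uint8)).save("composited.png")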

Culonavirus

If you zoom in and squint your eyes, it does look like some kind of shiny coin.

What I'd like to know, though, is how the model is so bad that when you tell it to "remove this artifact", instead of looking at the surroundings and painting over it with some DoF-ed out ocean, it slaps an even more distinct artifact in there. Makes no sense.

samsartor

A lot of current inpainting models have quite a lot of "signal leak". They're more for covering stuff vs removing it entirely.

Ironically, some older SD1/2-era models work a lot better for complete removal.
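
As a rough illustration (not Adobe's pipeline), here's what a removal attempt looks like with an SD2-era inpainting checkpoint via diffusers; the prompts, filenames, and step count are just examples:

    import torch
    from diffusers import StableDiffusionInpaintPipeline
    from diffusers.utils import load_image

    # Inpaint the masked artifact, prompting for background rather than an
    # object so the fill "removes" instead of "covers".
    pipe = StableDiffusionInpaintPipeline.from_pretrained(
        "stabilityai/stable-diffusion-2-inpainting", torch_dtype=torch.float16
    ).to("cuda")

    image = load_image("bird_photo.png")  # the full frame
    mask = load_image("glint_mask.png")   # white where the artifact should go away

    result = pipe(
        prompt="calm out-of-focus ocean water, soft bokeh",
        negative_prompt="coin, object, logo, text",
        image=image,
        mask_image=mask,
        num_inference_steps=30,
    ).images[0]
    result.save("removed.png")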

AuryGlenz

I mean, this is notable because it screwed up. It usually does a pretty good job. Usually.

In this case there are better tools for the job anyways. Generative fill shines when it’s over something that’d be hard to paint back in - out of focus water isn’t that.

gedy

Heaven forbid your picture has a woman in it somewhere though, Adobe's AI will refuse to fill half the time. I've taken to censoring out blocks of the image with black squares if it has any body parts showing (still clothed), fill, copy, then undo the censoring. It's pretty ridiculous for a paid subscription.

kyriakos

For a paid product, even if the content explicitly contained nudity or depicted sexual activity it should still have been allowed, as those are valid cases that Lightroom and Photoshop could be used for. The censorship in AI is stupid; babysitting users should not be part of the tool's responsibility. It's like banning kitchen knives to keep people from using them for violence.

ripped_britches

“Large company ships half baked product” has gotta be the least interesting story to read

slg

Sure, if you view this as an isolated incident. But I think of it more as the latest example of a larger trend: the industry has gone mad, actively making its products worse with half-baked AI features. That is a more interesting story.

DrewADesign

And this is the closest thing the professional imaging world has to a broadly-available tool designed for high-end professional use cases. In its current state it's barely good enough for throwaway blog headers when it comes to large edits, and for small edits it's exactly 0 percent better than the removal tools they added 20 years ago. Adobe had better start giving some love to its professional users, because the alternatives are getting better and their ecosystem is getting worse. It's like they're trying to put themselves in a position where they're competing with Canva and Instagram rather than Affinity, Procreate, Rebelle, etc. If it's not clear to them yet that they're not right around the corner from having their AI tools be a drop-in replacement for their regular stuff, they're gonna have a bad time.

thegeomaster

Is it actively worse, though? My impression is that all of the other, classical in-painting methods are still alive and well in Adobe products. And I think their in-painting works well, when it does work. To me, this honestly sounds like an improvement, especially in a creative tool like Lightroom or Photoshop --- the artist has more options to achieve their vision; it's usually understood to be up to the artist to use the tools appropriately.

ryandrake

I'm not an artist or an Adobe customer, but when I see products adding half-baked or defective features, it tarnishes their brand and would definitely make me re-consider trusting their product. Especially for professional use, and regardless of whether the rest of the product still works fine. It's a general indicator of carelessness and tolerance of quality problems.

DrewADesign

Unfortunately, the Adobe ecosystem as a whole is OS-level complex, and they're pretty much ignoring anything that isn't directly related to generative AI, and that stuff is only consistently good for the lowest-level use cases. Miles more useful than Comfy, etc. for most artists and designers, but not close for people who need to do more skillful work. The value of Adobe as an ecosystem was their constant upgrading and keeping up with the times, and now, if it's not some cockamamie generative AI feature, it's going nowhere. They're even worse with bug fixes than they were before.

johnnyanmac

The fact that we call it the least interesting story shows exactly how interesting it is that we just accept that companies are expected to ship broken slop.

ttoinou

I don't see where in the picture the zoomed-in bitcoin crop is taken from.

jxi

You have to click into it as it's not visible from the preview.

crooked-v

Bottom left.

permo-w

it's in the second picture not the first

uberman

Down from the tip of the bird's right wing, near the very bottom of the image.

missing-acumen

A question for people who know Adobe Lightroom: could this feature be compromised? Is it just making API calls to some remote thing?

nshireman

Lightroom has a local Heal/Remove feature, and at least with LR Classic you have to tick a box for the AI remove, which processes it on Adobe servers.

As for whether it can be compromised... Probably? It sends all or some of your photo to a remote server, so that data can certainly be taken.

missing-acumen

I mean, getting the model to behave this way looks too easy, and I assume Adobe does QC on the features it releases, so I'm not sure I see an alternative explanation - either that, or Adobe's QC is poor/nonexistent.

wmf

I'm not sure what you mean by compromised but I'm pretty sure Adobe Firefly AI features are server-based. These features are too good to be done locally.

jsheard

Plus even if it could be done locally, doing it server-side has the side benefit (for Adobe) of making it trivial to prevent pirates from ever being able to use those features.

missing-acumen

By compromised I mean something like someone having access to Adobe's servers where this is running and uploading troll models or toying with the model's responses.

IAmGraydon

It’s almost like integrating a poorly understood black box with your software is a bad idea.

jdoliner

It's telling you what it's mining in the background with those extra gpu cycles.

benoau

They'll just add a disclaimer somewhere.

throwaway81523

I had to look up "butlerian jihad" (from one of the bsky comments) and now I want one too. Yow.
