Sora might have a 'pervert' problem on its hands
92 comments
· October 25, 2025
marcellus23
smcin
A good-faith reading of the sort of suggestion the author never made is for the blanket opt-in consent to allow other users to generate images/videos of you to be segmented into separate consents for PG, adult, fetish, etc.; also face/whole body. A very clear consent form that tells them upfront: "If you consent to users generating fetish content of you, here are some examples of what's allowed and forbidden."
> Is the author suggesting we should stop people from creating fetish content of purely AI-generated characters?
Presumably not, but she's farming outrage rather than suggesting any fix. In the above suggested setup, people could then generate fetish content from the much smaller set of users who consented to have fetish content generated from them. But then of course they might expect some royalties or revenue-sharing, or at least identification/attribution/watermarking so the depicted user could drive traffic to their social media. OpenAI is skirting around not just segmented consent but any concept of revenue-sharing (i.e. OpenAI wants to dip its toe into OnlyFans territory, but without any revenue-sharing or licensing deal with creators).
aleph_minus_one
> OpenAI is skirting around not just segmented consent but any concept of revenue-sharing (i.e. OpenAI wants to dip its toe into OnlyFans territory, but without any revenue-sharing or licensing deal with creators).
OpenAI is still doing basic experiments to learn which product offerings are well received by users and/or work well, and which are not. If some data provided by users (e.g. photos depicting the user) turns out to be so essential to the success of the AI-created content that OpenAI would likely lose an insane amount of money if those users left (I think this is rather unlikely, but not impossible), then OpenAI will think about some concept of revenue-sharing, but not before (why should they?).
mslt
Pointing out that something feels a little creepy, while explicitly stating that you don't yuck other people's yum, is hardly "farming outrage." We collectively need to have earnest conversations about how emerging technologies affect our experiences, and her tone is pretty middle-of-the-road.
quantified
How does other people doing things affect her, though?
fwip
I don't think she's passing any judgment here - she pointed out earlier in the article that she knows people are into weird stuff, and didn't want to "yuck their yum."
Jackson__
I don't think one gets to call these people 'pervert' in the title, then claim that you don't 'yuck their yum'. They yucked their yum before the article even started.
regularfry
Titles frequently aren't in the control of the article writer. Don't know if that's the case here or not.
TiredOfLife
> but surely there's nothing inherently wrong with using AI for fetish content.
It's taking work from Onlyfans, Artists who draw fetish content on commission (usually by copying other people art style), fanfic writers (who copy writing style, characters and setting of other people) and other organic and free range fetish content producers.
marcellus23
I see that position, but also, almost everyone (99% of people?) generating fetish content with AI would not otherwise be paying artists to commission that content.
casey2
Many girls believe that there is both inherent harm and external harm in the creation and consumption of sexual content. Arguing that AI content is inherently harmful is tricky, but can be replaced with yet more external harms.
whitexn--g28h
Many religious extremists, gender unrelated.
ivape
The problem is adults contribute to turning public platforms lewd. So one lewd person on Instagram leads to many, leading to a lewd platform. This becomes problematic when the children uptake it. It’s not really too different than prostitution appearing near general-purpose places, it turns it into a red light district.
A lot of social media is a sex platform, and it got mixed up in this way because there’s no talking adults out of being lewd in public.
tremon
Present-day adults have been raised in a society where overt sexualization has been the norm for decades. I see normalization of lewdness as a logical consequence of 60 years of sexualization in media, and the growing resistance against it as the normal swinging of the pendulum.
frumplestlatz
I wish we could talk adults out of it, but many decades on this earth have convinced me that’s just not going to happen.
mmooss
It's not easy or sure, but behaviors do change, dramatically sometimes. Youngish people have significantly less sex than before. People smoke a lot less in some countries, as one simple example.
Social media is still immature. We'll develop norms around what is appropriate.
vunderba
From the article
> How do you stop people from making fetish content of purely AI-generated characters that aren't cameos of real people? Does OpenAI want to stop that?
I'd say a resounding no. Didn't Sam Altman announce a little while back that they're exploring allowing ChatGPT to be used for erotica / NSFW generation?
blindriver
This is kind of a side issue, but I think the time has come when videoing people in public and using their images without explicit consent should be outlawed. It used to be okay when the only media were TV and movies and distribution was hard, but now it has become a nuisance at best and dangerous at worst. The only exception I would make is for actual journalists with some sort of credentials. But posting videos of unwitting, unconsenting people on social media should be outlawed in an age where anyone with a phone can upload a video in seconds.
afavour
In a world where absolutely everyone has a camera in their pocket I can’t see how you’d ever be able to enforce this.
lyu07282
Terrible argument. You should say: because of generative AI, where anybody can fabricate any image/video/audio of anyone in any context without their consent, we urgently need new regulation of social media platforms to accommodate this profound change in our reality. Then make your argument.
You have had no expectation of privacy in public spaces since forever, and that is not the problem. Nobody could photograph you stabbing someone and upload it to social media without you actually stabbing someone. This is now different, because anyone can make that photograph of you stabbing someone and post it.
That must be your argument.
And it must be on the social media side, because in X months some open model on GitHub is gonna make every watermark or cloud-based safety feature meaningless anyway.
cyberax
Just force Sora to avoid photos of real people. Instead, synthesize a "generic" face. This could be done with training: create a database of photos to avoid and train Sora against it.
thfuran
Is it actually easy to train the AI on all the faces so it can make pictures that look like humans while also training it to not make pictures that look like any specific human?
itake
the whole point of sora is you can generate photos of yourself and friends...
The app allows you to control if other people can generate photos of you. If the author doesn't want other people to make these photos, disable public video generation...
ungreased0675
Wait a sec, you have to specifically opt-out of this? OpenAI needs to be bankrupted for their massive copyright infringement.
vunderba
Sigh, it's in TFA.
> I've allowed anyone to make "cameos" using my face. (You don't have to do this: You can choose settings that make your likeness private, or open to just your friends — but I figured, why not? And left my likeness open to everyone, just like Sam Altman.)
_345
no she opted in
frumplestlatz
> I've allowed anyone to make "cameos" using my face. (You don't have to do this …)
… and this is probably where the article should’ve ended. Or in fact, where the author should’ve realized there didn’t need to be an article at all.
People are weird and gross. They do weird things that we would often prefer they not. Sora provided a tool to avoid that weirdness. Use it.
torginus
People always have been pervy and gross in private and society has always provided outlets for them to do so while preserving their privacy and dignity.
However, using a wannabe AI social media platform to engage with this stuff (and said platform encouraging you to do so), is crossing several uncomfortable lines for me.
frumplestlatz
It crosses lines for me too. So I won’t be allowing people to use my face. QED.
torginus
I mean any sort of involvement in this in any way, either making stuff or being used as a face model or anything. This is semi-public and content you make is associated with you, and so is the content made about you.
Lerc
This does appear to be her shtick. Engaging in a thing so she can report upon it.
https://www.businessinsider.com/threads-meta-engagement-rage...
mcphage
That is something of a valuable service. People might see that option and think “well, why not? What’s the harm?” This way they can learn what the harm is, without having to be the person being harmed.
Lerc
I agree that this can be a method that can provide insight. It's also one of the areas where journalistic ethics are most strongly considered.
A journalist has done good work if they report on their ability to smuggle a replica bomb onto a plane. It's a bit hazier if they smuggle a real bomb on board because it puts people at risk. They shouldn't blow up a plane to show how easy it is.
I didn't offer any judgement in my comment as to the ethics of this particular reporter, just noting that was the style that they do.
The claim that she purported to represent Meta in public statements would, if true, count as unethical journalism. I don't know the accuracy of those claims, so at this stage I remain undecided but wary.
mslt
That’s called Journalism
GaryBluto
All evidence points to the author trying to make this a moral panic, especially with the emphasis on "real women" being used to generate these (despite the feature meaning they have consented to it)
JKCalhoun
Will you be a little outraged though when they use real women without their consent?
If we know anything about software in general (and AI specifically) getting around roadblocks is often a fairly simple thing.
exasperaited
Yes. And we also have documented cases where generative AI, in the hands of people with serious psychological issues, is significantly accelerating those people's loss of control of those issues, to very negative outcomes.
The fact that the AI industry is apparently littered with incredibly immature guys who perceive themselves as Randian superheroes does not reassure me that this tool is going to be better.
GaryBluto
> Will you be a little outraged though when they use real women without their consent?
It's a complicated issue that I've considered many times before. If we deem deepfake pornography unethical because it creates images/videos that look like real people, what does this mean for "lookalike" pornography, featuring actors done up (or who just naturally look) like famous people?
For example: Let's say Person A has a friend, Person B, who looks like Person C. Person B consents to Person A using artificial intelligence to generate pornographic images of them, which in turn look like pornographic images of Person C. Should Person A need consent from Person C?
panny
>Will you be a little outraged though when they use real women without their consent?
Probably not. AI slop doesn't really "go viral" except when it's super ridiculous, like shrimp Jesus. Most AI slop porn is likely seen by only tens of people. If someone generates porn with my face on it and I never even know, how does that harm me? Why should I care?
panny
Isn't generating fetish content a legitimate business model? Doesn't anyone remember the song,
>The internet is for porn
I actually think this is what's going to happen with AI once the easy money dries up. They'll quickly race to the bottom selling porn generators. AI slop porn already seems like the majority usage after homework generation.
QuadmasterXLII
The article is describing a feedback loop: very few women consent to having their face be usable -> perverts vastly outnumber those women, so each woman, who has posted nothing NSFW or even suggestive, gets dedicated attention from many perverts -> other women aren't comfortable allowing their faces to be used by large numbers of perverts -> very few women consent to having their faces be usable.
This is all independent of what is and isn't a legitimate business model; it's a social dynamic. It's also a pretty familiar one: it shows up everywhere from nightclub bouncer policies to the dynamics of early-2000s IRC rooms.
threatofrain
The number of perverts vastly outweigh the number of women? Um?
panny
You can also use photoshop to glue a woman's head on a porn star's body. The results are about the same as AI slop. AI is just faster.
yesbut
might? ascii art has a pervert problem.
appreciatorBus
Yup. Likewise cinematography, photography, fiction writing, spoken language, and cave paintings, all have had a pervert problem since day one.
yesbut
humans are perverts. who knew?
doganugurlu
I have no idea how I would feel if someone used my face in fetish or sexual content. It's never happened. It'd probably make me uncomfortable. But I imagine I would be OK with it if I had grown up in a culture where sex wasn't as much of a taboo. Maybe I would even find it flattering.
My mental test for deciding whether something should be illegal or unacceptable is questioning whether anyone would see it the same way if religion had never existed.
During #metoo I remember reading an article where the author was uncomfortable with the “drug fueled sex parties in Silicon Valley.” They basically didn’t want consenting adults to do drugs or engage in group sex. The argument against fetish content with AI generated characters reminded me of the #metoo author’s discomfort with the drug/sex freedom of the Bay Area. The article about Sora sounds like the author is uncomfortable with people generating fetish content, regardless of the content featuring real people or not.
It’s sad that the liberals now include the prudes/conservatives.
frumplestlatz
I don’t think one is a prude or a conservative for not wanting AI generated porn of themselves to exist.
I also don’t think one is a prude or a conservative for thinking there are consent and power issues around anything that commingles sex and the workplace.
Things can be unacceptable without being illegal. Things can even be unacceptable without needing to be banned or privately controlled.
My bar for what should be unacceptable is a lot lower than my bar for what should be illegal or privately banned.
Making weird pregnancy fetish videos of real people without their permission is definitely unacceptable. I have no issue with the idea that anyone doing that should be shamed.
doganugurlu
I was pretty clear about my agreement on the real-people AI fakes issue, and pretty clear on what I considered to be prudish. Would you mind reading my comment and editing your response? Or are you responding to someone else?
Edit: the fact that the author is right to be uncomfortable with their face on generated fetish content, doesn’t make their stance on fetish content with generated characters less prudish. They can be right about one thing, and prudish about the other.
> And how do you stop people from making fetish content of purely AI-generated characters that aren't cameos of real people? Does OpenAI want to stop that? Maybe OpenAI thinks it's fine for people to make belly-flation or foot-fetish videos as long as they're not of a real person.
I can't figure out the tone here. Is the author suggesting we should stop people from creating fetish content of purely AI-generated characters? OpenAI might want to for business reasons, but surely there's nothing inherently wrong with using AI for fetish content. Should we also stop people from drawing fetish content with pencil and paper?