Meta apologises over flood of gore, violence and dead bodies on Instagram
61 comments · February 28, 2025
r0fl
Instagram is borderline a porn/gore website if you open a new account and spend one day doom scrolling.
The amount of NSFW content, including women "breastfeeding" fake babies, wild accidents, and weird ultra-graphic shock videos, is insane.
I have a few accounts that I use to post for work. My explore pages are wild compared to my personal account, which is mostly tech, cars and architecture.
I don't "like" anything on my work accounts, ever; the algorithm just knows what to feed me because of what I click on the most (impossible not to click on those thumbnails).
mrtksn
On the "Threads" app I got a very similar experience: a lot of breastfeeding erotica and other weird adult content that made me feel like I was doing something illegal.
Also, their algorithm must be assessing human features too, because at some point Instagram started showing me women with ever larger breasts. I get that they try to understand what I'm into, but it's utterly ridiculous to push toward the more and more extreme.
I wonder how large the cohort for all this weird stuff is, considering that Instagram itself works as a "not a creep" credit score when you meet someone new.
Recently I gave in, started following some of those weird accounts and bookmarking deepfake scam ads out of fascination, probably adding fuel to the fire. It's just immensely weird.
mrbonner
You can say the same thing about Snapchat. My kid (12 yo) had this on their phone last year. I check the phone from time to time to make sure there is no weird stuff or messages in there. I opened Snapchat and saw the "recommended feed" with very "strange" thumbnails. I deleted the app and told them not to install it again. I put restrictions on their Apple account to explicitly allow only apps rated 4+. Somehow Snapchat is rated 10+. Am I crazy?
karmakaze
I don't use Instagram or any social media apps--I do use the web apps. For YouTube, if I'm tempted to click something I wouldn't have searched for myself, I open it in private. I mark watched videos with a Like. I also actively click "Not Interested" and even "Do not recommend channel". My recommendations are narrowly curated. I even got YouTube to 'run out' of recommendations, showing me the same set on reloads.
penr0se
Same here. I had to create a custom uBlock filter for YouTube because it ran out of recommendations and half of the suggested videos on my home page were from my Watch Later playlist.
whalesalad
this is why instagram is so fun. a lot of people have their friend group chats in imessage, wechat, fb msg etc... but the best group chats i've ever belonged to are on insta and the unhinged meme feed is glorious.
milesrout
So you click on this crap and you are surprised it is what you get? What other signal does it have to go on? "Doctor, it hurts when I punch myself on the nose." "Don't do so, then."
I "not interested" everything I don't want to see on every social media platform and never have any of these issues people complain about.
UncleMeat
Shocking and enraging content activates minds. These products don't just count likes but also things like pausing your scroll over a piece of content. It is really hard to avoid the immediate lizard-brain response of pausing over shocking content, leading to more and more of this content. The net effect is a product that doesn't make people feel happy or enriched, but one that makes them feel angry and stressed in a way that is difficult to pull away from.
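Roughly, the dynamic is something like this (a toy sketch; the signals and weights are invented for illustration and are not Meta's actual ranking logic):

```python
# Toy model of engagement-weighted ranking. Dwell time counts as a
# positive signal even when the viewer froze in shock rather than enjoyment.

def engagement_score(likes: int, shares: int, dwell_seconds: float) -> float:
    # Hypothetical weights -- the point is only that dwell time can dominate.
    return 1.0 * likes + 2.0 * shares + 5.0 * dwell_seconds

posts = [
    {"id": "cat_video",   "likes": 40, "shares": 5, "dwell_seconds": 3.0},
    {"id": "shock_video", "likes": 2,  "shares": 0, "dwell_seconds": 15.0},  # viewer paused in shock
]

ranked = sorted(
    posts,
    key=lambda p: engagement_score(p["likes"], p["shares"], p["dwell_seconds"]),
    reverse=True,
)
print([p["id"] for p in ranked])
# ['shock_video', 'cat_video'] -- the involuntary pause outranks genuine approval
```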
r0fl
I'm surprised the "crap" is there in the first place. Without me clicking anything at all, on a brand new, fresh account with zero prior interaction with any content.
faxmeyourcode
As always, it's the users who are at fault. Nevermind the thousands of kids and teens who are exposed to this shit day in and day out.
arkh
Before social media you had sites like Rotten. And if you were a teenager with access to the internet, you perused some of those.
milesrout
It is the user's fault when the user says in his post that he doesn't interact with the algorithm or give it anything to go on, except that he clicks on content he doesn't like, and then, lo and behold, it gives him content he doesn't like.
Children should not be on social media at all.
chewbacha
This was the original report from 404media[0]. They are doing really great work over there.
[0] https://www.404media.co/instagram-error-turned-reels-into-ne...
MrMcCall
Seconded. They have been doing great work for a few years now.
sdwr
If this is because of reduced moderation, does that mean the algorithm defaults to shock content? That gets the most attention, and they're artificially suppressing it most of the time?
If the attention economy is a race to the bottom, then the most popular content should be stuff that doesn't quite break the public decency rules.
drpossum
> If the attention economy is a race to the bottom, then the most popular content should be stuff that doesn't quite break the public decency rules.
I'm not sure that follows, if shock content is indeed what gets people's attention and engagement, good or bad.
micromacrofoot
it certainly means they can successfully detect it, which makes it all the more abhorrent that they don't moderate it very well (or at all)
mc32
Ultimately it's because users consume shock content; if they didn't, it wouldn't be surfaced, and instead they would get whatever generates views, be it cats or rainbows.
cardanome
No, stop blaming the users.
Look at this comment: https://news.ycombinator.com/item?id=43204869
It clearly shows that passively scrolling on a fresh account will get you gore in no time.
Yes, shock content makes us humans look, but that doesn't mean users want this kind of content. They are being manipulated by these algorithms. Social networks are optimizing for "engagement", meaning ultimately profit, not what users actually want.
This is the lazy excuse of every company that has ever hurt society: oh, the users want it. No, nobody asked for more gore on Instagram. You just want to save money on moderation.
neuroticnews25
"passively scrolling" is still sending a stream of user actions and choices. But you're right, I trained myself to swipe past animal cruelty and ragebait on FB as fast as possible and still get it recommended. Maybe in some indirect way making me feel bad still increases my engagement.
ourmandave
I remember when ISIS posted that beheading video of a journalist, I had to go out of my way to avoid seeing it. And that was before Instagram.
Arnt
There are two things here. The company has an algorithm that tries to find out what makes users click, scroll onwards, or return soon.
And then there is what users like.
The company chooses to act as though the two are equal. That's not a given. Nobody forces Insta to assume that.
intended
What did you mean stop blaming the users?
You mean that the firms are incentivized to show outrage inducing content to gain attention?
If so, that's also NOT a problem with the firms. You, me, and everyone with a phone is in some form of information and attention war. Or an over-farming situation.
- There is a competitive process for attention
- What is crazy today is boring tomorrow, as people get desensitized
- All content competes with all other content, for the little working memory that people have
Personally I describe it as a gearing ratio issue. We have one small gear which is everyone’s short term attention.
In the mean time, we have a larger gear which is the content cycle. This gear is spinning faster and faster as the velocity of content generation increases.
Anyone who takes a stand to slow down or make their content less attractive will get out-competed by more competitive content.
Any moderation of this process breaks the lay definition of free speech.
So you have a runaway effect where the content must get more outrageous.
This isn’t the fault of anyone, but a fundamental problem when free speech meets faster and faster content creation tools.
We're at the point where GenAI slop will simply out-compete everything, and grow everywhere like a weed.
I'm telling people I know that you have to give up on facts; the signal-to-noise ratio of the future is going to be hopelessly poor.
Instead, you will have to make do with whatever content you are looking at, and focus on the techniques which you use to interact with the content or people.
It's hard to blame a singular firm for this, when anyone who takes a stand will be crushed by the wheel of competition and free speech.
nullc
I read that comment's "doom scrolling" as implying that they intentionally watched negative-emotion-producing material. Is this an incorrect reading? If so, how does "doom scrolling" differ from "scrolling"?
brookst
It’s somewhat more nuanced than that; a classic sampling problem.
When Instagram knows nothing about user X, it trends toward shocking content that user X may not want. Which is bad.
But the reason it does so is that the shocking content has higher engagement, on average, among users A - W. It is not right to blame user X, and it is very likely that users D, H, and M are dismayed at the shock content but can't help slowing down when they scroll past it.
So we can blame Meta to some degree, and users A - W (except D, H, and M) to some degree, and users D, H, and M to some degree.
But if the total ad revenue from users A-W was lower when shock content was featured, user X would not be presented with it.
Of course, Meta could just be decent. But so could the other users. The fact that neither are true is an emergent property of sociopathic corporations and an unfortunate side of human nature.
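A toy sketch of that cold-start fallback (the names and numbers are invented for illustration; this is not a claim about Meta's actual system):

```python
# Illustrative cold-start ranking: with no history for a new user,
# fall back to population-average engagement, which favours shock
# content whenever the population as a whole lingers on it.

population_engagement = {
    # hypothetical average engagement across users A-W
    "knitting_reel": 0.20,
    "cat_reel":      0.35,
    "shock_reel":    0.60,  # inflated by reflexive pauses, not approval
}

user_history: dict[str, float] = {}  # brand-new user X: nothing known yet

def score(item: str) -> float:
    # Prefer a personal signal; otherwise use the population average.
    return user_history.get(item, population_engagement[item])

feed = sorted(population_engagement, key=score, reverse=True)
print(feed)  # ['shock_reel', 'cat_reel', 'knitting_reel']
```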
rightbyte
I had this experience where Facebook.com started to spam me with more and more car accident videos. Probably since I hovered over them to block the sender.
I would guess that there could be a similar effect here. And I guess my eyes linger longer on a car getting stuck on a railroad crossing than on something that is not inherently dangerous, too.
UncleMeat
It is hard to look away from somebody beating a dog with a belt. In our daily lives we are ethically trained not to look away from this sort of thing.
Even just pausing your scrolling over a video is enough to train your feed to show you more videos like that. So when you encounter something like animal cruelty in your feed and you, quite naturally, pause for a moment in shock you will start seeing more of this content. Again, as you pause in shock on this content you'll keep seeing more. And more.
And we don't accept this "it is because users consume this content" reasoning as a general rule. Users consume hardcore porn. Nevertheless, these platforms seek not to show people hardcore porn.
Tade0
It only surfaces because the algorithm is set to a simple "more engagement = more visibility" rule. If it removed outliers, not only would this not be the case, it would arguably also create a more pleasant experience for everyone.
Of course greed prevails, which is why we are where we are.
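Even something as crude as trimming the outliers before ranking would change the picture (toy numbers, purely for illustration, not how any real feed is built):

```python
# Toy illustration: rank by engagement, but drop items whose engagement
# is wildly above the typical range for the pool.
from statistics import median

engagement = {"recipe": 0.30, "music": 0.35, "diy": 0.28, "shock": 0.95}

cutoff = 2 * median(engagement.values())  # crude outlier cutoff, illustration only
kept = {k: v for k, v in engagement.items() if v <= cutoff}

feed = sorted(kept, key=kept.get, reverse=True)
print(feed)  # ['music', 'recipe', 'diy'] -- the outlier never surfaces
```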
xtiansimon
Which users most heavily weight algorithms for viral effects? The guy who looks at dogs and bikes, or the teen user who doesn’t know their arse from their elbow?
gosub100
> users
survivorship bias. you aren't accounting for how many users were repulsed or shocked (read: shook up, not a momentary "wtf") and left the platform.
KaiserPro
So what I don't understand is that it's perfectly able to categorise this content really well, but it's perfectly happy to allow it on the network, even though it clearly violates the community guidelines.
Yes, moderation at scale is hard, but most of the battle is getting decent categorisation.
jon-wood
The recommendation algorithm isn't necessarily identifying this as shock content or gore, all it's doing is identifying it as having the hallmarks of something that a certain circle of people engage with, followed by it deciding that everyone engages with it and it should be the default. It's possible that internally it's tagging it as gore, it's also possible that it's simply categorising it as "fast moving video with an abundance of red".
soco
I would assume that after such a successful AI-based categorization, one single human moderator could do wonders deciding whether it's an abundance of red or something else. But yeah, why would we censor the money-making?
r0fl
The extreme content gets the most user interaction and leads to higher daily active users
Which is a very important metric.
As long as the content is not illegal, if it positively affects time on site/app, it stays on the platforms for as long as it can.
KaiserPro
> The extreme content gets the most user interaction and leads to higher daily active users
For some people, yeah, but for a lot it causes the opposite. That's why it's quite heavily policed. It's not like thirst traps, which are liberally sprinkled in within the first 5 minutes of creating an account.
donatj
Mine has become all things that look like genitalia at a glance but aren't...
I'm guessing some algorithm picked up on me stopping and being like "what the hell?" and read that as liking it...
theshrike79
> "We have fixed an error that caused some users to see content on their Instagram Reels feed that should not have been recommended"
How did that stuff exist on their platform at all?
pbalau
Users post it?
theshrike79
Shouldn't a multi-billion dollar company have moderation in place to tag content like that?
I'm pretty sure if you start posting spoilers about Marvel properties or straight-up pirated content it'll be taken down as fast as possible.
But gore? Nah, why bother.
jajko
And nobody really moderates it... pretty sure Meta is breaking some laws in at least some jurisdictions, but this is nothing new.
In the past year or so I have seen a massive increase in various fake posts from unknown groups on Facebook: made-up shorts, AI-generated fake photos with various weird, outrageous claims, all of which turned out to be fake on a side check, but that was not obvious at first glance. I try to report them, mark them as not interesting, etc., but they keep popping back without me ever being interested in them.
Meta is really shitting on its users and just milking the current status, or doesn't care how content on their platforms degrades further, or even quietly allows it. I'd say it's the second one, and it is really by choice. YouTube has similar potential, but somehow they manage to sanitize it all well, so clearly it's possible.
brookst
Right? Not a fan of Meta, but the flip side of this is “how dare those censorious corporate overlords limit what I can post to my account”
Jotalea
A couple of months ago I heard a rumor that a certain account was planning a raid on Instagram for today (Feb 28, 2025), saying "something will happen on that date". But I just thought it was some kind of weird advertising method from Meta; it kinda worked though, it got me to open the app (I'm too curious).
jmclnx
Never used it, but how about apologizing for the neo-Nazi items too? I have to assume there are some, since they now allow right-wing propaganda.
MrMcCall
Mark Zuckerberg will never be happy, after all the horrors his businesses have helped perpetuate, the kinds of lifestyle his brands perpetuate, the folks he has helped put into power.
The look on his face at the inauguration is a testament to how awful he feels every minute of every day. Few people in history have caused as much misery to so many people.
We ALL reap what we sow, for good or ill, so I suggest doing things that make other people happy, and never cause them misery or grief.
Look at the face of Lebron James when he surprised the kids at his personally-funded school in Akron, OH. His smile demonstrates the joy that is possible when one spends their money on others' happiness.
"There's still time to change the road you're on." --Stairway to Heaven
Happiness requires making money in an ethical way, and then spending it ethically after having earned it.
AlecSchueler
Had to delete mine a few weeks ago when I suddenly got a pro-book-burning reel, showing Nazis burning books with the caption "They tell you they burned books, but never which books."
It had 30,000 likes and the top comments were all along the lines of "The victors write history, they will never say the bad guys won WW2" and "Hitler was just fighting the Jewish transgender conspiracy in Germany. Trump will defeat it in America" etc.
I used Instagram to follow traditional musicians and people tailoring their own historical clothing. But I can't stay on a platform that is so happy to give a place to that kind of outright hate speech.
r0fl
I haven't been served that type of content, but I feel the same way as you lately. It feels like it's shock content in a race to the bottom for users' attention, instead of showing interesting content I would actually like.
MrMcCall
We are all choosing sides, each and every day, between compassionate concern for our fellow human beings, and callous disregard for others' happiness.
Choose well, everyone. The karma you reap is always exactly what you deserve, for good or ill.
And the pleasure of having power over others is not a good replacement for peace of mind, joy, and happiness.
MathMonkeyMan
Instagram is an ad-matching machine. Meta is not in the business of caring about people.
RodentQueen
[dead]