Self-hosting your own media considered harmful according to YouTube
695 comments
June 6, 2025
hshdhdhj4444
Yeah because if it wasn’t for COVID YouTube, Facebook, et al would never have removed any content on their platform, unlike what they had been doing all this while…
There are so many issues with this.
Being able to pick what content they host is fundamental to freedom of speech for private entities.
The real problem is twofold. 1. A few platforms hold monopoly positions. Who else can compete with YouTube? And the reason isn’t necessarily that YouTube has a particularly better UI that keeps viewers and content creators on it. The reason YT has all the content creators is that it leverages Google’s ad monopoly and is able to help creators make money. A decently functioning anti-trust system would have split Google Ads from the rest of the company by now.
2. The devastation of the promise of the open internet. VCs have spent hundreds of billions of dollars to ensure we remain in walled gardens. Open-source, self-hosted software, on the other hand, where the benefits are shared rather than concentrated in individual hands that can then spend billions to ensure that concentration, has suffered.
We need govt funding for open source and self hosted alternatives that are easy and safe for people to setup.
Combine the two and instead of YT getting to choose what videos are seen and not seen on the internet, major and small content creators would self-host and be the decision makers, and still make similar amounts of money because they could plug in the openly available Google AdSense (kind of like how you can on blogs…).
somenameforme
I think their real edge is a practically free and practically infinite bandwidth/capacity global CDN setup. There's no real technical reason for this still to be the case, but bandwidth costs are significant for people relying on other services to provide them. Or they're cheap and slow/capped.
This is the main reason I think alternative sites have a hard time competing. Play anything on YouTube from anywhere and if it's buffering/slow then it's probably your internet connection that's the problem. By contrast do the same on competing streaming sites and it's, more or less, expected especially if you aren't in certain geographic areas.
Monetization on YouTube is mostly just a carrot on a stick. The overwhelming majority of content creators will never make anything more than pocket change off of it. That carrot might still work as an incentivization system, but I don't think it's necessarily the driving force.
luhn
Yeah, anybody can make a half-baked CDN, but Google has PoPs inside ISPs across the world [1] and competing with that is essentially impossible.
[1] https://support.google.com/interconnect/answer/9058809?hl=en
tshaddox
I have to imagine that YouTube also has massive storage requirements that are a non-trivial portion of Google’s storage costs.
sfn42
I'm not really disagreeing with you but I have a 700/700 fiber connection that generally works perfectly for anything I do, and youtube craps out pretty frequently. It'll just fail to load videos and I have to refresh up to multiple times before it starts working properly.
Also the frontend is generally very wonky; I'm wondering if it's severely over-engineered or something. It seems very simple, but it's failing at all kinds of stuff all the time. Shorts fail to load when scrolling, the scrolling just stops working, and sometimes it keeps playing the previous video's audio while the current video is frozen.
Sometimes if I write a comment and try to highlight and delete some of it, when I hit backspace it deletes the part that wasn't highlighted. A normal <input type="text" /> does not do that. Have they implemented their own text inputs in JS or something? All you need for that component is a form with a textfield and a submit button. As far as I know that won't behave this way, so I'm not sure what they're doing, but it doesn't seem great.
I went and checked, it's a div. No idea why they would do that for that simple comment form.
clucas
Plainview: You gonna change your shipping costs?
Tilford: We don't dictate shipping costs. That's railroad business.
Plainview: O-oh! You don't own the railroads? Course you do. Of course you do.
mbrumlow
> Being able to pick what content they host is fundamental to freedom of speech for private entities
I simply don’t think this applies to places like YouTube.
But if it does, then they must also be responsible for the content. It makes no sense that curating content is their free speech but at the same time it's not their speech when the content could have legal repercussions for them.
The argument that removing videos is their speech implies that hosting videos is their speech. So they should be liable for all content they post.
beej71
They are two different things, though. One is actually producing content, and the other is deciding which content to host and share. And there are all kinds of various legal and illegal combinations here. For instance, maybe they decide that it's okay to host Nazi content, something that is absolutely protected under the First Amendment. Or maybe they decide that it's not okay to host Nazi content, even though it's definitely protected under the First Amendment.
Also see Gonzalez v. Google.
But really the most dangerous thing here is telling a company that they are legally liable for everything their users post. A large company like Google has the legal firepower to handle the massive onslaught of lawsuits that will instantly occur. A smaller startup thing? Not a chance. They're DOA.
Heck, even on my tiny traffic personal website, I would take the comment section down because there's no way I can handle a lawsuit over something somebody posted there.
I should not be required to host content I do not wish to host. And at the same time I must be shielded from liability from comments that people make on my website, if we are to have a comment section at all.
brookst
That would make sense if this were a math theorem, but law and liability and society don’t usually work like math.
Three things can be true:
1. YT and similar give people a platform for speech
2. So long as they make a good faith effort to identify and remove content that is illegal, the hosted speech is not theirs.
3. As platform owner they are also free to exercise speech by moderating topics for any or no reason
tzs
> The argument that removing videos is their speech implies that hosting videos is their speech.
There is no such implication because the first is an affirmative act based on their knowledge of the actual content and the other is a passive act not based on knowledge of that content.
zmgsabst
That’s my opinion:
If you exhibit pre-publication restraint, you’re an editor of an anthology — and not an information service hosting user content.
LocalH
Why should "YouTube" as an entity enjoy freedom of speech? They're a platform for user-generated content. Outside of outright illegal content (which is even tenuous sometimes; I'd like to reserve this for the worst of things), they shouldn't be able to pick and choose which UGC they are willing to allow. They're the modern "town square". They're effectively a monopoly in this day and age (yes, there are other video hosting platforms, but YouTube has the largest share of all by far, and is the de facto place people expect to find video UGC).
Serving video with high availability to millions of people is hard. Few organizations, that aren't already flush with capital, are going to be able to replicate that at any sort of scale.
I'm tired of big corporations using their might to override individual freedom of speech. Once you reach a certain size, you should have to make moderation a more personal thing. Instead of taking videos that aren't illegal in and of themselves down, they should have to empower the user to moderate their own feed. Of course, this is incompatible with the modern drive to use these platforms to push content in front of people, instead of letting them curate their own experience.
I don't have all the answers, but the "corporations = people, and thus corporations have freedom of speech" angle has done a lot of damage to the rights of individuals.
int_19h
I think one thing that we should be more cognizant about in general is that corporations are a legal construct to begin with, and as such, there's no natural right to incorporate - it's strictly a privilege. So society attaching even very heavy strings to that is not unreasonable so long as they are applied consistently to all corporations. Which is to say, if corporations don't do what we as a society want them to do, beating them with a large and heavy stick until they start doing that is not wrong, and we should be doing more of it.
And if people really want their freedoms, well, they can go and run their business as individuals, with no corporate liability shield etc. Then I'm fine with saying that their freedom of speech etc overrides everything else.
bigbadfeline
> A decently functioning anti-trust system...
Unfortunately, it's a tall order in the current political environment for the same reason open source funding isn't forthcoming, these are just parts of a bigger problem which is best discussed elsewhere.
With that said, you're absolutely right in your assessment; this is approximately what needs to happen in order to improve the current sorry state of media and public discourse. Sadly, as evidenced by the other replies to your comment, the public at large simply doesn't get it, and the situation is even worse with the structural changes needed to make a real solution possible.
It's a vicious cycle that results in ever worse media, and not only media. The current public spat between the two smartest people in the world (by mass media metrics), garnished with public blackmail attempts and private social media channels, is jaw-dropping proof of dysfunction, but of course the media presents it as casual entertainment.
LocalH
> Sadly, as evidenced by the other replies to your comment, the public at large simply doesn't get it, and the situation is even worse with the structural changes needed to make a real solution possible.
The ones with money and power (which are effectively the same thing) want it to be this way, as it makes them richer and more powerful. The masses are just pawns literally being moved around on the chessboard of society.
AnthonyMouse
> Being able to pick what content they host is fundamental to freedom of speech for private entities.
Here's some text from Section 230 of the CDA:
> (c) (2) Civil liability
> No provider or user of an interactive computer service shall be held liable on account of—
> (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
> (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1)
...
> (e) (1) No effect on criminal law
> Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
Now in this case, you have YouTube, a service with obvious market power, taking down content promoting a competitor to YouTube. There are Federal criminal antitrust statutes.
intended
One thing I really wish is that more people volunteered to moderate things. It's a volunteer position, it's needed for most of the communities we are part of, and doing this raises the floor of conversations across the board.
The distance between the average viewpoint on how free speech works, and the reality that content moderation forces you to contend with, is frankly gut-wrenching. We need to be able to shorten that distance so that when we discuss it online, we have ways to actually make sense of it. For the creativity of others' ideas to be brought to bear.
Otherwise, we're doomed to reinvent the wheel over and over again, our collective intuitions advancing at a snail's pace.
driverdan
Why would you volunteer your time to a for-profit company?
ClumsyPilot
> Being able to pick what content they host is fundamental to freedom of speech for private entities
Interesting position - when somebody posts illegal content on YouTube, they are not liable, it’s not their speech.
But when I want to post something they don’t like, suddenly it’s their freedom of speech to remove it.
A lot of breakdown in society lately is clearly coming from the fact that some people/companies have it both ways when it suits them.
akimbostrawman
The solution would be to revoke Section 230 from any platform which acts as a digital public square if it does moderation beyond removing illegal content.
Of course they would try their best to be excluded, to have their cake and eat it too.
intended
Sadly it turns out that the biggest driving force is politics, and the inability of our institutions to win with boring facts against fast-and-loose engaging content.
The idea is that in a competitive marketplace of ideas, the better idea wins. The reality is that if you don't compete on accuracy, but compete on engagement, you can earn enough revenue to stay cash flow positive.
I would say that as the cost of making and publishing content went down, the competition for attention went up. The result is that expensive-to-produce information cannot compete with cheap-to-produce content.
tzs
Your premise is incomplete. When someone posts illegal content on YouTube they are not liable if they are not aware of the illegality of that content. Once they learn that they are hosting illegal content they lose their safe harbor if they don't remove it.
Sloowms
This is correct. In the US, TikTok is currently being sued for feeding kids "choking game" content through the algorithm that was earlier judged to be free speech.
gspencley
> Interesting position - when somebody posts illegal content on YouTube, they are not liable, it’s not their speech.
> But when I want to post something they don’t like, suddenly it’s their freedom of speech to remove it.
There is no contradiction there.
Imagine a forum about knitting. Someone, who has it in for the owners of this knitting forum (or perhaps even just a SPAM bot) starts posting illegal, or even just non-knitting content on this forum.
The entire purpose of the forum is to be a community about knitting.
Why is it the legal or moral responsibility of the knitting forum to host SPAM content? And why should they be legally liable for someone else posting content on their platform?
You're equating specific pieces of content with the platform as a whole.
There is no reality where I will accept that if I create something (I spend and risk my money on web hosting, I write the code, I put something out there) other people get to dictate what content I have to distribute. That's an evil reality to contemplate. I don't want to live in that world. I certainly won't do business under those terms.
You're effectively trying to give other people an ultimatum in order to extract value from them that you did not earn and have no claim to. You're saying that if they don't host content that they don't want to distribute that they should be legally liable for anything that anyone uploads.
The two don't connect at all. Anyone is, and should be free to create any kind of online service where they pick and choose what is or is not allowed. That shouldn't then subject them to criminal or civil liability because of how others decide to use that product or service.
Imagine if that weird concept were applied to offline things, like kitchen knives. A kitchen knife manufacturer is perfectly within their rights to say "This product is intended to be used for culinary purposes and no other. If we find out that you are using it to do other things, we will stop doing business with you forever." That doesn't then make them liable for people who use their product for other purposes.
andrepd
> A lot of breakdown in society lately is clearly coming from the fact that some people/companies have it both ways when it suits them.
See how copyright is protected when it's What.CD violating it versus when it's OpenAI.
mardifoufs
Mhmm, so would it be fine for a private platform not to allow, say, Muslims on their website? Especially a platform as big as YouTube? I mean, it's essential to their rights to be able to do that, I guess?
Like I understand your point, but this argument is usually not actually useful. Especially since it's usually not coming from "free speech absolutist" types, so it always comes off as a bit disingenuous. Unless you are arguing for big corporations having an absolute right to free speech, which I would disagree with but would at least make the argument consistent.
dragonwriter
> Mhmm, so would it be fine for a private platform not allowing say, Muslims on their website?
Depends on the sense of “private”.
If it is private in the sense that it is a platform run by a Christian Church for the use of organizations affiliated with that Church, and not offering information dissemination to the general public, sure.
If it's a private business offering platform services to the public at large but specifically excluding Muslims, then it is potentially engaging in prohibited religious discrimination in a public accommodation. Unlike religion, political viewpoint is not, federally, a protected class in public accommodations, though state law may vary.
(OTOH, under the federal Religious Freedom Restoration Act and similar laws in many states, and case law based on and in line with the general motivation of such laws, laws including state public accommodation laws, are being looked at more skeptically when they prohibit religious and religiously-motivated discrimination, as an impairment of the religious freedom of the discriminating party, in theory irrespective of the religions on each side, but in practice favoring discrimination by Christians and against non-Christians, so possibly the Muslim exclusion would succeed even in a public accommodation.)
pr0zac
I don't think anyone would argue that would violate freedom of speech; however, it would still be illegal, as it would violate the Civil Rights Act by discriminating based on religion. There's more than one right involved in your hypothetical, basically.
Lutger
I don't see how one necessarily leads to the other. There's obviously already filtering going on on YouTube, even before COVID, on illegal content and also on legal content that is against the policy (adult content, for example).
How is COVID disinfo during the pandemic suddenly a slippery slope for anti-competitive measures, while all the other moderation measures aren't? What's so special about anti-COVID-disinfo rules?
I think we really need a better argument than 'making any rule leads to making bad rules, so we better have no rules'.
aleph_minus_one
> What's so special about anti-COVID-disinfo rules?
- The magnitude of content involved.
- The fact that there exists a significant part of the society which is vocal about not endorsing these particular deletions.
- The fact that many people became aware of the moderation ("censorship") that YouTube does and its power.
- The fact that these COVID information videos (despite being perhaps wrong) formed important patterns of opinions, i.e. some opinions considered "extremist" or "wrong" were suppressed.
shakna
We also suppress videos on the correct manufacturing process for plastic explosives. Not because doing it safely is a bad idea, but because proliferating bomb making materials is.
Covid disinformation got people killed. It will continue to get people killed, especially with a proponent of it leading the US health service.
Things likely to lead to death, are likely things you do not want on your platform.
thrance
[flagged]
ulrikrasmussen
> Many will say, this is good, children should be protected. The second part of that is true. But the way this is being done won't protect children in my opinion. It will result in many more topic areas falling below the censorship threshold.
For example, YouTube currently has quite a lot of really good videos on harm reduction for drug users (and probably also a bunch that are not very good and/or directly misleading). I would expect all of such videos to be removed if such a child protection law was passed, because any neutral discussion of drug use apart from total condemnation is typically perceived as encouragement. That would deprive people of informative content which could otherwise have saved their lives.
actionfromafar
All these concerns are muddled by thinking about YouTube as the example: it is such a blind meta-machine optimising for ad revenue that it's already actively pushing all kinds of harmful content.
SkyBelow
The problem with any law made for a good purpose is that, even if you can get everyone to agree on the general statement of a good purpose, there are disagreements on what actually counts as achieving the goal, on both a moral and a scientific level.
For example, providing information on how to do some harmful thing X more safely might increase the risk of people doing X. On the moral side, someone might argue that even 1 more person doing X is worse than the reduction in harm for the others doing X. On the scientific side, there is likely no direct evidence for the exact numbers (ethical concerns with such research and all that), so you'll have some people disagreeing on how much the harm is increased or reduced, and different numbers can both be reasonable but lead to different conclusions given the lack of direct research.
This all becomes supercharged when it comes to children, and you'll find people not even being consistent in their modes of thinking on different topics (or arguably they are consistent, but basing it off of unsaid, unshared assumptions and models that they might not even be consciously aware of, but this then gets into a bunch of linguistic and logical semantics).
andrepd
Big tech censorship disgusts me. Everything is completely backwards from what it should be, and the sheer scale of those platforms (bigger than many countries by population or money) prevents individual people and even governments from exercising meaningful democratic oversight. So these platforms congregate hundreds of millions of people and whatever their CEOs and/or douche tech bros in SV decide is what becomes law.
Another example: videos about the Holocaust or WWII atrocities. Every one of them is demonetised and hidden from recommendations because it touches a horrifying topic. Harms the children? On the contrary, nothing is more important in an age of global waves of fascism than a lesson in how it went last time.
Meanwhile the whole platform is a cesspool of addictive brainrot, gambling ads, turbo-consumerist toy unboxing videos, etc. Things that are actually truly harmful to kids. These are not restricted, these are promoted!
War is peace etc etc. Good is evil and evil is great. Everything is backwards.
I hate this so much.
ndriscoll
The thing to understand is your last paragraph: everything big ads does is unsurprisingly focused toward making people into worse versions of themselves. You wouldn't let kids go to a casino or porn site for educational material. Don't let them use youtube either.
It could be that someone happened to post educational videos to the porn site. If so, you might as well download them while you have the chance, but don't mistake their existence for some indication that that's what the site is for. They're still less than 0.1% of the videos, and you'd need to specifically search for them or be linked to them to find them. Assume you'll need to look elsewhere for educational material. e.g. there are tens of thousands of results for videos for "Holocaust" on WorldCat.
ulrikrasmussen
I agree 100%, and in another comment I also suggested an alternative to child protection laws, namely that we should severely restrict the viability of the ad tech business model altogether. While it does make certain niche content creation financially viable which otherwise wouldn't, in the grand scheme of things the negative externalities outweigh the good.
KurSix
Once you normalize vague enforcement around "problematic" content, the net just keeps widening
slg
These slippery slope comments always seem a little naive to me because they imply there is some pure way to handle moderation. In practice, you have to be an extremist to think literally no content should be removed from Youtube with the most obvious example of something nearly everyone wants to be removed being CSAM.
Maybe you would respond by saying that is illegal and only illegal content should be taken down. According to which laws? Hate speech is illegal some places, should that be removed? What about blasphemy?
Maybe you would suggest to closely follow the local law of the user. Does that mean the site needs to allow piracy in places that is legal? And who decides whether the video actually violates the law? Does the content have to stay up until a court makes the final decision? Or what about content that is legal locally, but might be under some restrictions. Should Youtube be obligated to host hardcore porn or gory violence?
There needs to be a line somewhere for normal people to actually want to use the site. I'm not going to claim to have the perfect answer on where that line should be, but there is always going to be an ongoing debate on its exact placement.
ulrikrasmussen
The problem is the nature of YouTube, which is a platform with the main purpose of generating revenue based on advertisement while minimizing their own operational risk. YouTube does not care one bit about whether the content they show is informative, harmful or entertaining, they care about maximizing the amount of ad impressions while avoiding legal repercussions (only if the legal repercussions carry a significant cost, of course). This naturally leads them to err on the side of caution and implement draconian automated censorship controls. If the machine kills off a niche content creator then it means nothing in the grand scheme of things for YouTube. YouTube is a lawnmower, and you cannot reason with a lawnmower.
This is very different from past "platforms" such as niche phpBB boards on the old internet, book publishers or even editorial sections in newspapers who at least to some extent are driven by a genuine interest in the content itself even though they are, or were, also financed by advertisements.
The main problem here is that we allow commercial companies to provide generic and universal "free" content platforms which end up being the de facto gatekeepers if you have something to say. These platforms can only exist because the companies are allowed to intersperse generic user-generated content with advertisements. In my opinion, it is this advertisement-financed platform model that is the core problem here, and automated censorship is only one of the many negative consequences. Other problems are that it leads to winner-takes-all monopolies and that it strongly incentivizes ad companies such as Google to collect as much information about people as possible.
lucianbr
> According to which laws?
This part at least seems to be no problem. Many platforms already follow and enforce different rules in different jurisdictions.
> And who decides whether the video actually violates the law?
There are myriad laws around the world, and somehow we manage to decide what's legal and follow the law, at least most of the time. This argument is absurd on the face of it: "we can't have a law because laws are too difficult to follow and enforce".
People and corporations make their best attempt to follow the law, regulators and institutions give guidance, courts adjudicate disputes. Do you live somewhere where it works differently?
_heimdall
> Should Youtube be obligated to host hardcore porn or gory violence?
YouTube can decide to host, or not host, whatever it wants. The challenge is with unclear terms of use. They have a habit of taking down videos with little or no reason given, and it isn't clear what terms the video content would have violated.
Of course they can draw their own lines, but they should be clear and consistent.
aleph_minus_one
> In practice, you have to be an extremist to think literally no content should be removed from Youtube with the most obvious example of something nearly everyone wants to be removed being CSAM.
What is extremist about this opinion? (EDIT: with the exception that we indeed remove CSAM and similar things "everybody" wants removed and will (importantly!) otherwise get YouTube into deep trouble, but (basically) nothing else)
jaapz
> In practice, you have to be an extremist to think literally no content should be removed from Youtube with the most obvious example of something nearly everyone wants to be removed being CSAM.
This is not what is being said in the comments you are replying to, you are taking it to the other extreme yourself
timewizard
> a line somewhere for normal people to actually want to use the site.
Youtube is a private company. They can make whatever additional moderation decisions beyond the law they want. Which are in no way based on what you want but are entirely based on what advertisers want. This control effectively answers every question you raised.
In any case, Youtube is the size where it can grapple with all these questions you just posed, but anyone else hoping to challenge their monopoly or otherwise host a small collection of videos, perhaps for a specific purpose or community, now effectively cannot.
> but there is always going to be an ongoing debate on its exact placement.
Who exactly started _this_ debate? Was there some recent outcry from the citizens that their lives have become unlivable due to the lax content restrictions on social media? Really?
_fat_santa
At least in the context of Covid, the real issue I saw was not the taking down of content, it was that a very small group of people dictated what content should be taken down.
Generally speaking in the world of "science" (any field) there will always be a level of disagreement. One scientist will come up with one theory, the other will come up with another theory, they will endlessly debate until the topic is "settled" and then the whole loop repeats if another scientist thinks that the settled topic is not actually settled. Overall I would say this is a very healthy dynamic and keeps society moving forward.
What people got so mad about during COVID was not the content being taken down, it's that you had various scientific organizations around the world straight up break what I described in the previous paragraph. During COVID you had one group make endless rushed decisions, and then when other scientific groups challenged those findings, the response was not what I outlined above but rather an authoritarian "I am the science" response.
This "main group" (NIH, CDC, etc.) painted all those challenges as conspiracy theories, but if you actually listened to what the challenges were, they were oftentimes quite reasonable. And the fact that they were reasonable arguments highlighted the insane hubris of the "main group" and ultimately led them to lose virtually all credibility by the time COVID wrapped up.
thrance
No it doesn't. I reject your slippery slope fallacy.
The line must always be drawn somewhere, should YouTube allow neonazi content because any censorship leads to more censorship? Of course not.
SkyBelow
It is a logical fallacy if used as part of an absolute claim, but that doesn't make it always wrong when used in general statements. Some slopes are slippery; we can look at history to see this. We can't claim all slopes are slippery, but that doesn't mean that no slope is slippery.
People aren't starting with axioms and then defining what absolutely will happen. People are discussing trends that appear to happen generally, but there will be exceptions. Going to college leads to a better job is a slippery slope, it doesn't always happen, but going to college is still good advice (and even better advice if one is willing to go into detail about the degree, the costs, the plans at college, and so on).
If we want to reject something as a logical fallacy, we need to consider if the other person's argument hinges on something always happening as some sort of logical proof, or if it hinges on it happening only at or above some threshold. If the first case, pointing out a slippery slope argument is a valid counter, but in the second case, it isn't and instead leads to two people talking past each other (one arguing X happens often enough to be a concern, the other arguing that X doesn't always happen, both statements that could be true).
A4ET8a8uTh0_v2
[flagged]
b800h
What confuses me is how TV App providers are going to make this work. How is the interface going to work to allow me to use YouTube on the TV whilst checking my age, and ensuring that it's me using the TV each time it's turned on? And how is a TV different to a computer? It's completely impractical.
camgunz
If you replace "Covid" with "child porn" or "animal cruelty" or "anti-semitism" you'll see how bad this argument is.
thrawa8387336
Those 3 are not even comparable to each other...
camgunz
Oh you don't think a plague that killed over 7 million people rates w/ animal cruelty?
redm
In my experience with OFCOM, child safety is just the gateway to a vague list of bullet points including "terrorism" and "hateful" content (vaguely defined); what could go wrong??
bugtodiffer
UK's predator network was also built to protect kids but in the end is only used for copyright infringement
BLKNSLVR
I like the way Jeff signed off the article, pointing out that whilst the video has been pulled for (allegedly) promoting copyright infringement, Youtube, via Gemini, is (allegedly) slurping the content of Jeff's videos for the purposes of training their AI models.
Seems ironic that their AI models are getting their detection of "Dangerous or Harmful Content" wrong. Maybe they just need to infringe more copyright in order to better detect copyright infringement?
throwaway290
> Youtube, via Gemini, is (allegedly) slurping the content of Jeff's videos for the purposes of training their AI models
If by "allegedly" you mean that google admitted it
> Google models may be trained on some YouTube content, but always in accordance with our agreement with YouTube creators (https://techcrunch.com/2024/05/14/google-veo-a-serious-swing...)
Where "agreement" likely means "you accepted some tos 15 years ago so shut up".
> the video has been pulled for (allegedly) promoting copyright infringement
the irony...
consp
> Where "agreement" likely means "you accepted some tos 15 years ago so shut up".
I am not a content creator or business on YT, but I am 99.9% certain that as soon as you enter your business credentials to make money, they pretty much are allowed to do as they please and change the terms without notice (to which you must agree). And because, as pointed out in the article, YT is a monopoly in all but name, you have to agree to it as there are no viable alternatives.
yard2010
Gemini doesn't need his video as training data; Google can just torrent any content and use it as training data, just like Facebook.
fer
Uh? Veo 3 is arguably the result of owning YouTube and tapping into its content. No need to torrent much if you store the largest amount of footage on Earth.
hbn
The Veo demos I saw all looked like Hollywood productions. Not like YouTube videos which are 99% garbage you wouldn't want to train off of.
Mindwipe
You mean in the way Meta is being sued for, and bluntly are almost certainly going to lose and have to pay out lots and lots and lots of money for?
libraryatnight
The hoovering of data for ads went about the same. They consumed my data and told me it was for better ads; the most visible result is that I get ads for things I've already bought, and it conflates searches made only in the spirit of understanding with desire. On the bright side it's produced quite a few good jokes. "I googled Breitbart and I'm getting ads for testosterone treatment and viagra!" [my wife, 2014]
The least these creeps could do if they're going to treat us like this is deliver the experience they say the evil justifies.
sp0ck
This is a mass problem with almost any topic you want to share. I'm a sport shooter, range officer, and competition jury member. You have no idea what crazy stunts YouTube pulls for gun/sport-shooting related content. YT's terms contain some of the weirdest restrictions for things like "shown magazine capacity". Wrong angle on a video and your 10-round mag is seen by YouTube as a 30-round one, and your video is gone.
You can show a silencer disconnected from a firearm, or connected to a firearm, but show the moment you screw it onto the end of the barrel and your video is banned. There are dozens of rules so vague that if YT wants, it can remove any gun-related content.
This is a problem YT is not willing to fix because the collateral damage costs are peanuts compared to being sued and losing because some real illegal content slipped through the filter. I don't expect any improvement here because there is no business justification.
freedomben
Indeed. A family member of mine had a helpful amount of income coming in from a channel of his that was gaining momentum. The point of the channel was to teach gun safety to people new to guns. Keep in mind that where we live, all of this is 100% legal and even encouraged, yet YouTube threw so many ridiculous barriers in the way that he could not create much content that didn't end up getting removed. He eventually threw in the towel, and now people new to guns have less access to genuinely helpful information that might save their lives. It seems ironic to me that they had to aggressively remove anything that mentioned COVID and didn't go exactly down the government line because otherwise it could get people killed, but they have no problem removing gun safety videos.
ndriscoll
That's because they are not a platform for education. They are a platform for ads and encouraging hyperconsumerism. They merely allow educational material, sometimes. Expect more videos to be removed over time that don't align with their goals. e.g. I would not expect playlists of hour-long MIT lectures to stay there for the long term as the platform moves more toward shorts and algorithmic recommendations. Or their vast library of people's old random amateur videos that barely get any views/generate almost no revenue while costing them money.
hbn
But the tech companies all replaced the gun emoji with a squirt gun like a decade ago, I thought all gun violence ended after that?
amrocha
Guns are not safe. No matter what you do, accidents will happen.
I don’t think Youtube is the place to look for education, and neither does youtube apparently.
It’d be pretty bad if someone watched youtube videos and thought they could handle guns safely and ended up hurt.
That doesn’t seem like a bad thing to me.
christophilus
Same for tobacco stuff. I follow a few pipe-tobacco reviewers, and YT has begun to tighten the clamp there, too.
It wouldn’t bother me if YouTube wasn’t basically a monopoly. I know some of them have been switching to Rumble, but to be honest, the competition is so fragmented that I don’t see any of them gaining critical mass.
mvieira38
We should host a tobacco related Peertube instance at this point. Get Muttnchop, Snus at Home and some other guys on it and we would be free from youtube
threetonesun
Host your own content, monetize your own blog. I get that not as many people can do it without access to the big platforms but... that's ok?
mousethatroared
And break the big tech monopolies is also... ok?
philistine
Are you aware that gun laws are not the same around the world and that YouTube likes to enjoy revenue worldwide?
I see all the rules you describe as an American company trying to marry the gun culture of the US with the far more reserved stance of the rest of the world.
hardwaresofton
> In that case, I was happy to see my appeal granted within an hour of the strike being placed on the channel. (Nevermind the fact the video had been live for over two years at that point, with nary a problem!)
Looks like some L-(5|6|whateverthefuck) just got the task to go through YT's backlog and cut down on the mention/promotion of alternative video platforms/self-hosted video serving software.
Quick appeal grant of course, because it was more about sending a message and making people who want to talk about that kind of software think twice before the next video.
> But until that time, YouTube's AdSense revenue and vast reach is a kind of 'golden handcuff.'
>
> The handcuff has been a bit tarnished of late, however, with Google recently adding AI summaries to videos—which seems to indicate maybe Gemini is slurping up my content and using it in their AI models?
Balanced take towards the end (after the above quote), but yep, the writing is on the wall.
I really wonder where the internet goes in this age. The contract between third-party content hosts and creators is getting squeezed, and the whole "you're the product" thing is being laid bare more and more.
Is it a given that at some point creators will stop posting their content to platforms like YouTube? Is it even possible at this point, given that YouTube garners so many eyeballs and is just so easy? Does a challenger somehow unseat YouTube because programming and the underlying libraries (ffmpeg et al) become so easy to use that the cost of spinning up a YouTube competitor goes down to basically zero?
Seems like there needs to be a new paradigm for anyone to have a choice other than youtube. Maybe AI will enable this -- maybe "does jeff have any new videos" -> a video gets played on a screen in your house and it's NOT hosted on YouTube, but no one knows and no one cares?
genewitch
There's peertube and Pixelfed and someone was working on an activitypub version of Instagram, but really it's like Pixelfed but more ergonomic for video.
So the next stage, ideally, would be everyone kinda sharing hosting responsibilities, and if you like a creator, you just follow them. This has the benefit of possibly caching/mirroring all the videos, too. My Fediverse server was chewing through disk, one of the reasons I shut it down - but I was following 1400 news and journalist accounts, plus my ~100 or so gang of idiots. I was nearing 1TB on disk after about 16 months on my essentially single user instance.
I exported my follows and moved to an acquaintance's server and imported, the owner doesn't even blink. Who knows what they've got going for storage.
Anyhow if you don't need to follow 1500 people, this becomes tractable. If it gets popular, someone will post how to cron the multimedia stuff to compress it as it ages, moving it to cold storage, whatever.
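To make that concrete, here's a rough, untested sketch of the kind of cron job I mean (Python calling ffmpeg; the ~/fediverse-media and ~/cold-storage paths, the 90-day cutoff, and the CRF value are all placeholders I made up):

    #!/usr/bin/env python3
    """Rough sketch: re-encode cached media older than 90 days and move it to cold storage.
    Assumes ffmpeg is on PATH; all paths and thresholds are placeholders."""
    import subprocess
    import time
    from pathlib import Path

    MEDIA_DIR = Path.home() / "fediverse-media"   # placeholder: wherever your instance caches media
    COLD_DIR = Path.home() / "cold-storage"       # placeholder: slower/cheaper disk or a mount
    MAX_AGE = 90 * 24 * 3600                      # 90 days, in seconds

    def main() -> None:
        COLD_DIR.mkdir(exist_ok=True)
        now = time.time()
        for src in MEDIA_DIR.rglob("*.mp4"):
            if now - src.stat().st_mtime < MAX_AGE:
                continue  # still "hot", leave it alone
            dst = COLD_DIR / (src.stem + ".hevc.mp4")
            # Re-encode at a higher CRF to trade quality for disk space; copy audio as-is.
            subprocess.run(
                ["ffmpeg", "-y", "-i", str(src), "-c:v", "libx265", "-crf", "28",
                 "-c:a", "copy", str(dst)],
                check=True,
            )
            src.unlink()  # drop the original once the smaller copy exists

    if __name__ == "__main__":
        main()

Drop that into a weekly cron entry and the hot set stays small; the real thing would obviously want error handling and to skip files that are already HEVC, but the point is the whole "compress as it ages" policy is a few dozen lines.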
hardwaresofton
> There's peertube and Pixelfed and someone was working on an activitypub version of Instagram, but really it's like Pixelfed but more ergonomic for video.
> it's like Pixelfed but more ergonomic for video.
This is a huge problem IMO. Just like Mastodon/Bluesky (which seems to be working recently) and all these other things, the tech and experience need to be SUPER easy. I mean as easy as or easier than YouTube etc. for people to switch en masse.
> So the next stage, ideally, would be everyone kinda sharing hosting responsibilities, and if you like a creator, you just follow them. This has the benefit of possibly caching/mirroring all the videos, too. My Fediverse server was chewing through disk, one of the reasons I shut it down - but I was following 1400 news and journalist accounts, plus my ~100 or so gang of idiots. I was nearing 1TB on disk after about 16 months on my essentially single user instance.
Yeah the problem is people won't do this/can't be expected to do this, unless it's drop dead easy.
Really appreciate hearing your point about the scaling curve for this tech, though; clearly it has come really far. That sounds like much more than the average person would follow, and "only" 1TB on a single server is quite nice.
The best approach I have ever seen to this is PopcornTime. It took the world by storm (and IIRC people still use it / it still exists in some form, they're just lower profile now), and it worked better the more people used it, because of torrents (i.e., a mature technology that was a perfect match for the use case).
> I exported my follows and moved to an acquaintance's server and imported, the owner doesn't even blink. Who knows what they've got going for storage.
>
> Anyhow if you don't need to follow 1500 people, this becomes tractable. If it gets popular, someone will post how to cron the multimedia stuff to compress it as it ages, moving it to cold storage, whatever.
I could see this working if that acquaintance got paid for this. Tying money as an incentive to things is sometimes bad/not what you want, but having people think of computers and compute as an asset/tool for them to use is a step in the right direction IMO.
I'm not a crypto person (kind of wish I was, 10 years ago), but Filecoin was really interesting originally to me because it just made sense. The marketplace of data storage seems like something that could be easily democratized in this way (no need for the crypto bits, but the ease of payment was a legitimate use IMO).
genewitch
This isn't to argue! Just some clarification, because I really don't copyedit/proofread well enough on HN.
> I mean as-easy-or-easier than YouTube, etc for people to switch en masse.
Note: I said PeerTube but I meant YouPHPTube and its fork, AVideo: https://www.turnkeylinux.org/avideo
PeerTube is this, if you want anonymous viewing of videos. I'm not sure about ease of setting up; I don't remember having any issues, which means I can package it for others. But IIRC TurnKey Linux has PeerTube as a container, which means that with any hosting provider that offers TKL it's essentially 4 clicks to launch a PeerTube server.

The Fediverse is a little rougher, but I imagine a content creator would be the one to self-host (or have their own homeserver, but host it with a hosting provider for $50 a month or whatever), and everyone else can go to https://fediverse.party/ or whatever and find a homeserver. You don't need to run your own to participate. I was careful to suggest that more people should run their own instances, because I worry that the larger instances will get tired of adding 16TB drive sleds every year. I can't imagine what mastodon.social costs to run!

This also ties in with your final point; the acquaintance is part of the value4value movement, so they may get donations to offset costs, but I think they have a server room on their property with a couple of racks. Maybe they have solar and a sweetheart deal with their ISP - I did at one point, so I also had a server shed. Still do, but I used to, too.
> Filecoin
Oh, that technology that made buying a used HDD/SSD risky business for a few years? Now, AFAIK Filecoin didn't serve any useful purpose; it was just another "proof of X" where X was "I'm wasting a ton of storage space for this". IPFS et al. are the ones that do distributed storage.
One thing I would add - unless you absolutely need to, and I mean really need to, never upload high-def to these sorts of services. Upload your FHD/QHD/8K videos to the large hosts "for backup", mark them as unlisted, and then link to them for people to archive if they wish.
washadjeffmad
You left out one of the biggest providers of high (and low) quality VOD services: porn.
My intuition is that they're only left alone because they very explicitly don't step on any content and delivery trusts' toes.
I hadn't really thought about it until I did a crawl for a round of DMCA takedowns for a friend and was surprised by how many platforms apparently use the same few CMSes. It turns out, there are some fantastic, affordable options if you want to start an independent website and VOD service beyond the corporate fray.
lurk2
> Pixelfed
I suspect a lot of these projects are being held back by bad branding. The first time I heard the term “fediverse,” I assumed it was alluding to Facebook’s Metaverse being a project of the CIA.
9283409232
Branding and marketing are so important, and engineer-minded people spit on them and kick them to the back of the line, then wonder why their project isn't popular.
eptcyka
How will federation solve monetisation?
genewitch
value for value. https://value4value.info/
Or ad rolls, who knows. Monetization isn't my wheelhouse.
Podcasts have been doing this since their inception.
tumult
> Quick appeal grant of course, because it was more about sending a message and making people who want to talk about that kind of software think twice before the next video.
That was talking about a previous video, not the one that is the main subject of this blog post. For the video that is the subject of this blog post, which is just about running your own software to watch media you legally own, the appeal was apparently denied.
jaredhallen
Unfortunately, my belief is that no matter how many content creators jump ship, there will be an endless supply of replacements. The real salt in the wound is that attrition will select for the content that the platforms desire.
hardwaresofton
That's fine IMO! Content creation is a really top heavy thing anyway -- it seems like anyone can just be replaced with anyone else, but if that were true we wouldn't have such outsized discrepancies between successful creators who are able to monetize and those that can't.
It's a power law distribution. In fact, companies know this so they do sneaky stuff to keep high value creators on their platforms, have heard some stories (try to find some stuff on the Twitch vs Mixer saga).
JKCalhoun
> I really wonder where the internet goes in this age.
Self-hosting? (Whoops!)
hardwaresofton
If only, but we know that the vast majority of people don't want to self-host, as the majority of people don't even want to make their own coffee.
In the right form (on devices they already own, with internet connections they already own, etc) self-hosting could work though...
politelemon
Based on my observations over the past decade of similar stories on HN, nothing will change, the squeeze will simply continue.
It's because we only hear of incidents in isolation from each other, when the giants that abuse their platforms - most often the stories are about Apple, Google, or Amazon - take something down that didn't suit their revenue streams, even if it's by vague interpretations, AND someone with enough of a social presence has their incident heard.
The rest of us, the unwashed users of the platform, do not hear about it or act upon it en masse. We'll occasionally see a post like this on HN or Reddit, shake our heads and call it a shame, say there need to be alternatives and so on, then go right back to those platforms and forget that anything happened just a few months later.
anal_reactor
Most people don't even give a fuck. And most of those who do aren't ready to do anything about it. Thank you for coming to my Ted Talk.
hardwaresofton
> Based on my observations over the past decade of similar stories on HN, nothing will change, the squeeze will simply continue.
I do agree here, but sometimes (let's say 10% of the time? less?) the squeeze does not continue -- see Apple. Perplexity/ChatGPT vs Google search right now.
> The rest of us, the unwashed users of the platform, do not hear about it or act upon it en masse. We'll occasionally see a post like this on HN or Reddit, shake our heads and call it a shame, there need to be alternatives and so on, then go right back into those platforms and forget that something happened but a few months later
Yup, wish I could add "- posted from Chrome browser" to my own response here but I use Firefox. I'm still going to watch YouTube.
I think the thing that might bring hope is that Google/YouTube doesn't actually own the new paradigm of AI -- I can very much imagine a world where people just ask for videos/scroll through them, and YouTube isn't the site they do it on (in fact they don't do it on a "site", per se).
But then again, that's really calling for the death/dramatic reduction of the open/surfable internet. Is that what it takes?
BlueTemplar
Publicly shame people that use platforms. Especially the kind of scum that still does it professionally.
2025 has given a great opportunity to ratchet it up a notch (outside of the USA): with Trump 2 the pretense that the USA is an ally of EUrope is gone, so the decade-old conclusion that US laws aren't compatible with fundamental rights (Patriot Act => Schrems 2), and that therefore US infocoms are illegal — is not something that ought to be ignored any more (so far it was, out of convenience).
conradfr
Isn't the competitor TikTok?
hardwaresofton
Yes, sometimes -- I think TikTok's content/goals are a bit different than YouTube.
TikTok is a direct competitor to YouTube Shorts, but not YouTube as a whole -- YouTube also competes with Netflix and surprisingly paid course sites (did you know YouTube has courses?)
I don't think it's as easy as thinking TikTok will unseat YouTube. Also, I personally think TikTok's... approach is a bit hard to sustain. Just like Facebook's approach of initially showing you a feed of friends activities, but morphed into something else over time (some of that is not FB's fault, humans have certain behaviors that can be toxic all on their own).
touristtam
> TikTok is a direct competitor to YouTube Shorts
That sounds odd, since I recall them comparing themselves to IG shorts, and YT Shorts not being a thing while TT was becoming the "in" social media; just an observation, more than anything else.
bsder
> Is it even possible at this point given that YouTube garners so many eyeballs and is just so easy?
The big problem is that someone will download your video and upload it to YouTube if you do not. Often while monetizing it until you stomp on them.
The only things that will break YouTube hegemony (spelled that hegemoney originally ... talk about a typo) are either an anti-trust action or a successful copyright infringement lawsuit from someone other than a BigCorp.
hardwaresofton
> The big problem is that someone will download your video and upload it to YouTube if you do not. Often while monetizing it until you stomp on them.
Let's automate the stomping then. If people are bothered by this and it keeps happening, then that should create demand for someone who is able to scour YouTube and sue the appropriate parties/do the appropriate reporting.
At some point, it will become enough of a problem for YouTube that they will change/have to hurt their business model that currently benefits from it.
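I'm picturing something as simple as the untested sketch below, against the YouTube Data API v3 (the API key, channel ID, and title list are all placeholders I invented), which just flags likely re-uploads for a human to review and report:

    """Rough sketch: flag possible re-uploads of your own titles for manual review.
    Assumes google-api-python-client and a YouTube Data API v3 key; IDs/titles below are placeholders."""
    from googleapiclient.discovery import build

    API_KEY = "YOUR_API_KEY"                      # placeholder
    MY_CHANNEL_ID = "UC-your-channel-id"          # placeholder: your own channel
    MY_TITLES = ["Self-hosting your own media"]   # placeholder: titles you've published

    def find_suspect_reuploads():
        youtube = build("youtube", "v3", developerKey=API_KEY)
        suspects = []
        for title in MY_TITLES:
            resp = youtube.search().list(
                q=title, part="snippet", type="video", maxResults=25
            ).execute()
            for item in resp.get("items", []):
                snippet = item["snippet"]
                # A near-identical title on someone else's channel is worth a human look.
                if snippet["channelId"] != MY_CHANNEL_ID and title.lower() in snippet["title"].lower():
                    suspects.append((snippet["channelTitle"], snippet["title"], item["id"]["videoId"]))
        return suspects

    if __name__ == "__main__":
        for channel, title, video_id in find_suspect_reuploads():
            print(f"{channel}: {title} -> https://youtu.be/{video_id}")

The actual takedown still has to go through YouTube's own reporting/DMCA process, but the tedious "scouring" part is the bit that could be automated.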
> The only things that will break YouTube hegemony (spelled that hegemoney originally ... talk about a typo) are either an anti-trust action or a successful copyright infringement lawsuit from someone other than a BigCorp.
Really disappointed in lawyers of this age. I'm a layperson but it looks like they should have been eating out in the age of AI and with all the copyright infringement that goes on (whether you agree with copyright infringement or not). Why are there not 100 suits against these AI companies right now? Probably because it's too expensive and courts are already packed, but why let reality get in the way of a possibly really profitable venture?
I'm certainly not a great proponent of IP/copyright and all the associated moral stances, but IMO the tech is useful without that gray area -- having that stuff get properly legislated is only going to prompt retraining on safe/permissioned content, and maybe that's what SHOULD happen.
nottorp
> Let's automate the stomping then. If people are bothered by this and it keeps happening, then that should create demand for someone who is able to scour YouTube and sue the appropriate parties/do the appropriate reporting.
But it's already automated. Where do you think those completely wrong DMCA claims that people complain once in a while about come from?
kazinator
> L-whateverthefuck
LM
hardwaresofton
If LLMs are already doing this, engineers are cooked.
I don't think this is a job that requires an LLM but if an LLM took the order, made the plan to go through the relevant data(bases|lakes|platforms) and triggered the warnings, etc. I'd be very impressed.
dotancohen
I was thinking Lawyer.
KurSix
YouTube's moderation feels like it’s being done by a drunk Roomba half the time... totally missing context, especially when it comes to open source and self-hosting content. Meanwhile, there's a flood of actual piracy tutorials that stay up for years. Your video gets flagged for showing people how to use LibreELEC, but somehow there are entire channels pushing borderline NSFW content under the guise of "body art" or "educational content" that stay monetized and untouched.
PaulKeeble
The entire thing is done by algorithms, both Google's and those of the various legal groups that scour YouTube for infringement. The review process is equally automated. Google seems perpetually allergic to having humans involved at any point, and so it keeps compounding the mistakes the algorithms make by making them unfixable.
rat87
That's because the amount of content keeps increasing. Even outsourcing to low-cost countries, it wouldn't be cheap to hire thousands or tens of thousands of people to review cases. Still, you need to have humans in there somewhere.
RajT88
You can find entire albums and movies - but I get a copyright strike if I try to post a video of a live orchestral performance of a piece composed in 1954.
It's bizarre.
hsbauauvhabzb
Let's not forget that Geerling's income is probably significantly derived from YouTube. On the plus side, he's big enough that he has more sway than up-and-coming creators, either via a direct human rep or via another prominent YouTuber if he doesn't have one of his own. Small sites are SOL.
IshKebab
I think this is probably a problem with most internet moderation. You saw the same thing on StackOverflow - moderators spending big chunks of time going through a queue of things to moderate, so they use heuristics rather than really understanding the item.
Also most of the things in the queue should get "no" as an answer, so they just get into the habit of "no, no, no, no...".
aleph_minus_one
> Also most of the things in the queue should get "no" as an answer, so they just get into the habit of "no, no, no, no...".
I have access to these review queues on Stack Overflow (as basically everybody with sufficient karma has), but my default is "yes" (i.e. innocent, until proven otherwise).
IshKebab
I do too but every time I look at them... There are a lot of really bad questions. Like not even coherent English, just dumps of logs with no context. Stuff that definitely should be downvoted.
I was going to go and get an example from the queue but I just checked and they're actually all empty. SO is truly dead.
tsumnia
I've had 2 of my videos taken down - they were educational videos teaching how to use Microsoft Access (I know, I know, but lesson plans are lesson plans). We were using a fictional medical database to help explain tables and general querying.
BUT whatever the reason, be it a user report or YT's moderation team, showing table records was deemed inappropriate because I was "sharing PII". I appealed both cases and got rejected. Since I'm not a super important influencer, there wasn't much else I could do, so, sadly, students will have to struggle to figure out how to query dates in Access...
technothrasher
> I appealed both cases and got rejected.
I had an unlisted video with all of about six views blocked because there was a radio playing softly in the background. When I looked at their process for appeal, they specifically said that incidental background radio music is ok, and appropriate for an appeal. So I appealed. It instantly got denied. I gave up at that point as this private video really didn't matter. But it made it clear that their appeal process is just a sham.
gloosx
Since YouTube started showing me funny "TURN OFF THE ADBLOCKER!!!" notices, I just started slamming links into yt-dlp and watching them offline. No drawbacks so far.
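For reference, a minimal yt-dlp invocation looks something like this; the output template is just one common choice and the URL is a placeholder:

    # Download a single video with a tidy filename; skip the playlist if the link has one.
    yt-dlp --no-playlist -o "%(title)s [%(id)s].%(ext)s" "https://www.youtube.com/watch?v=VIDEO_ID"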
kassner
It will become harder to ignore the possible retaliation Google could take against your or your family's Google accounts.
Gareth321
This is why I have de-Googled my family - at least for the most part. The hardest part was Gmail. Hundreds of services and accounts relied on that email address for 2FA. If I were to be blocked from it, I would be screwed. So I bought a domain and spent the next couple of years migrating everything to it. Pain in the ass, but now no one can ever ban me from my own email address. Worst case scenario my provider blocks me and I switch to another one in minutes. Plus I can do cool things like catch-all, so when I sign up for services I use "verizon@[mydomain.com]". I have caught many cheeky fuckers selling my email address to spammers.
Outside of this there is very little harm in my Google account being banned now. I'd lose some YouTube watch history and a few locations on Google Maps.
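For anyone curious what the catch-all piece looks like: most mail hosts expose it as a single setting, but if you happen to run your own Postfix it is roughly a two-line virtual alias map. The domain and mailbox names below are placeholders, a sketch rather than a complete mail setup.

    # /etc/postfix/main.cf (excerpt)
    virtual_alias_domains = example.net
    virtual_alias_maps = hash:/etc/postfix/virtual

    # /etc/postfix/virtual
    # Catch-all: verizon@example.net, netflix@example.net, ... all land in one real mailbox.
    @example.net    me@my-real-mailbox.example

    # Rebuild the lookup table and reload Postfix.
    postmap /etc/postfix/virtual && postfix reload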
Bender
Just a suggestion, but make the canary/alias less obvious. Companies have caught onto this and treat aliases containing their name as "fraud", which is of course a load of crap. That's how Tractor Supply stole a gift card from me, so I've turned many of their customers away from them, and they've lost far more than they stole from me. So now I use realistic-looking aliases and keep my own lookup table of which one belongs to which company.
sitkack
takeout.google.com backs up watch history, google maps locations (probably)
Xelbair
Then I'll just stop using their services.
And if they're too big for people not to use, then they need to be split up, as they've attained a (virtual) monopoly over a specific market.
gloosx
It would not be easy to swallow, but I don't have any false expectation that these accounts are mine in any sense. They are Google's, and I'm just renting them, paying with my personal data to feed the AdSense machine. Any day they might decide to do what they want with them, or a bug or technical issue might lock me out, and I doubt I would have any way to influence it; customer care and user support are clearly not a priority for this company and are virtually non-existent.
account42
I have no illusions that this isn't how things are, but I don't think that means we should accept it. We can very well demand that if Google wants to take over that much of people's lives, it doesn't get to do whatever it wants. This becomes even more important as realistic alternatives become fewer and fewer.
kassner
I can control my own digital life, but I don't have the resources to do the same for every family member. An elderly family member who has owned Android devices for a decade is virtually impossible to untangle from Google.
yard2010
If you care enough, back up your data - they have to hand it to you.
teeray
It really would be nice if they weren’t allowed to create the equivalent of the digitally unbanked by unilaterally wielding this power without any kind of due process.
layer8
YouTube is the only thing I use a Google account for. If they "retaliate", I can probably just open another one.
gorbachev
I've stopped logging into YouTube for this reason. Next step is to install a "YouTube browser" and configure my VPN to make sure all connections from that browser go over the VPN rather than my ISP's direct connection.
yard2010
Friendly reminder to enable periodic personal data extraction from google (Google Takeout) and back it up so you don't lose your digital life in the rare case of being blocked.
ThunderSizzle
Might be better to just de-Google yourself. If google is isolated to just one feature, then it's not a big deal.
timeon
Off-topic, but I've found the best way to open YT is without an account. No recommendations on the front page. Just you and the search bar.
Gareth321
uBlock Origin Lite on Chrome seems to be working really well for me. Check out your filter lists and maybe tick a few boxes.
ezconnect
Mine is now limited to 3 videos, with a warning that they will block me next time. They also removed the wide-view button. I just copy the link and watch it in Firefox Nightly, not logged in to YouTube and with an adblocker, and YouTube doesn't complain. A bit of a hassle, but I can still watch.
reddalo
Are you using uBlock Origin on Firefox?
BlueTemplar
You don't have videos stop playing after exactly 1 minute?
Bender
I had that for a while but it turned out to be due to having both uBlock and NoScript on that machine. Now it's good for me.
ndand
There is a way to watch the video anyways, on YT with just 2 clicks.
ivanjermakov
Cooking at home is considered harmful according to restaurant owners.
neop1x
Yes, a great analogy. And actually, most of the time the food we cook at home is better than what we can get in restaurants. Sadly, cooking decent food at home takes time; it should have been a restaurant's job, but the reality is what it is...
carlosjobim
Learn a few dishes well and soon you'll be cooking them at home faster than it would take to wait for them at any restaurant, even including all the cleanup and dishes.
Bonus: Cheaper, much higher quality, much better taste, and most importantly: you can drink as much as you want without getting kicked out.
timeon
> Sadly, cooking decent food at home takes time
Why sadly? I need to eat more than I need to scroll the internet or do anything else. Preparing decent food is time well spent.
1970-01-01
Alphabet in general is ripe for disruption. Nothing they hold near and dear is long-term safe. They are close followers in several areas already. Gmail will probably be their last surviving product, because it holds our most sensitive data.
neepi
Yeah YouTube are getting shittier by the day. I keep getting banners telling me that ad blockers aren't allowed on YouTube. Stuff is pretty unwatchable now without them.
Well fuck you I'll just download the videos with yt-dlp instead. If that stops working, I'll not bother.
npteljes
>Stuff is pretty unwatchable now without them.
Subscribe to Premium, and the Google ads are gone. I think it's only fair, given how vast and complex YouTube is as a service.
amrocha
There was a headline here the other day that Youtube Premium Lite will now have more ads.
Eventually Premium will have ads too. It’s just a matter of time.
npteljes
I don't think I can solve these problems that far ahead. Currently, and for some good years now, YT Premium has not had any Google ads. I think subscribing to it is presently good value, and that's about it. We are free to cancel our subscriptions and do whatever adblocking we can when the ads eventually come.
I feel the outrage against the free YT, the free Spotify, and probably other services is misplaced, since these providers offer fair subscription prices that make the UX completely normal. I don't see why we, as users, should fight this. The energy could go to actually pressing issues, or into the content itself that we get from these services.
But I guess this is something that is up to each individual.
achrono
I think it's time for creative solutions on this front. This plugin business is a little like a cop living in a house of thieves.
For instance, how about an app that detects an ad and visually overlays a blank blob over the ad video (and of course mutes, or even just transmutes, the audio)?
We'd still pay that tax in terms of time, having to sit through those 30-60 seconds, but it's way better than also surrendering your mind to the utter intrusion.
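A rough browser-userscript sketch of that overlay idea is below. The .html5-video-player / .ad-showing class names are an observation about YouTube's current player markup and could change at any time, so treat every selector here as an assumption rather than a stable API.

    // Hypothetical sketch: blank out the tab and mute while the player reports an ad.
    // The overlay covers the full viewport just to keep the example short.
    const overlay = document.createElement('div');
    Object.assign(overlay.style, {
      position: 'fixed', inset: '0', background: '#000', zIndex: '99999', display: 'none',
    });
    document.body.appendChild(overlay);

    function update() {
      const player = document.querySelector('.html5-video-player');
      const video = document.querySelector('video');
      const adPlaying = !!player && player.classList.contains('ad-showing');
      overlay.style.display = adPlaying ? 'block' : 'none';
      if (video) video.muted = adPlaying; // note: this also un-mutes when the ad ends
    }

    new MutationObserver(update).observe(document.documentElement, {
      subtree: true, attributes: true, attributeFilter: ['class'],
    });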
nubinetwork
What ads? I run the equivalent of pihole, and use a Samsung TV user agent on YouTube.com/tv and I never see ads, except for an occasional banner on the home tab.
esskay
Pi-hole isn't, and hasn't been, able to block YouTube ads for several years now, so if you aren't seeing ads it's something else stopping them. Ads are served from the same DNS names as the videos themselves, so DNS-level blocking works very poorly for YouTube.
hsbauauvhabzb
Refreshing works for me, for now at least.
Glittergorp
You can use piped.
https://github.com/TeamPiped/documentation/blob/main/content...
neepi
Too unreliable. I just use yt-dlp and throw it on VLC on my iPad with airdrop.
If it's not worth that effort it probably wasn't worth watching anyway.
nicce
There will be a time when computing is so cheap that ads will be injected into the stream in such a way that it is impossible to remove them without a real-time AI detector that identifies which parts are ads.
Glittergorp
I appreciate it isn't brilliant. I use it as a fallback.
boomboomsubban
The video they made does not encourage piracy, but even if it did it seems bizarre to flag that as "Dangerous or Harmful Content."
dspillett
> Dangerous or Harmful
There were many attempts to link piracy to terrorism and the drugs trade.
Because what makes enough money for crack dealers & weapons traders to use for money laundering is, of course, some bootleg DVDs and adverts on torrent-tracker web front-ends…
hsbauauvhabzb
Dangerous and Harmful to Google's bottom line.
layer8
I mean, there is a risk that the feds would come down on you if you're not careful. ;)
AStonesThrow
[flagged]
geerlingguy
Every piece of content in my media library was paid for/legally acquired.
I go to some stupid lengths (probably a few thousand hours of my life by now, buying media from eBay, old library collections, and closed movie stores, then ripping everything) to make it so.
genewitch
I have so many DVDs... so many. I have 5 Vaultz cases full, plus a bunch of 4-per-flipped-page "CD wallets" that friends have given me over the years. Plus box sets of entire runs of stuff like NYPD Blue, Quantum Leap, Batman, Ghostbusters, Law and Order, Buffy, Firefly, Seinfeld... I must have well over 1000 DVDs by this point, and maybe 1/4 are on my NAS and available through VLC on an Amazon Fire Stick, Android, whatever. I got sick of Kodi taking 5+ minutes to boot and be ready to play files, and of intermittent networking issues, so I spent like $70 on an Android TV box and a Fire Stick and that solved that. The kid watches Beakman, Bill Nye, Bob Ross, Invader Zim.
If you're patient and know the places media hoarders haunt, you can find DVDs for pennies on the dollar. I'm sad all three pawn shops near me closed, because I picked up so much media at those places. There are a couple of other places that have used media, so I started going there 2 or 3 times a year.
I have banker boxes full of audio CDs in jewel cases in an air-conditioned shed. At some point, audio streaming is going to be so full of ads that you may as well just get SiriusXM and stop dealing with Spotify, and you'll need $300 worth of streaming subscriptions to keep up with the new hotness in television and movies. Or you can be like us: keep the old stuff alive, watch that, and discuss that.
You might find the older stuff doesn't make you feel bad, doesn't give you a headache, and will just feel like "home".
I pre-gamed this whole "AI content creation" ramp-up by decades.
BLKNSLVR
Respect.
I have a music library that I also like to keep 'clean', but it really is a lot of work over and above the, uhhh, alternatives. As such, it's quite the small library, but I look at it as concentrated quality.
AStonesThrow
https://en.wikipedia.org/wiki/Ripping#Circumvention_of_DVD_c...
Software tagged as "no longer available" is so because of legal action by the AACS group in a New York federal court in late March 2014.[12] The remaining US software has disabled the decryption feature that allows bypassing Blu-ray disc protections. As of October 2014 ... the applications still able to decrypt Blu-ray disc protection are freeware.
https://en.wikipedia.org/wiki/Blu-ray_ripper#Disabling_DRM
You can pretend to ignore the DMCA if you want, but I cannot believe that all of your DVDs and Blu-rays were unencrypted and unencumbered by any DRM before your ripping software used leaked/cracked keys to decrypt them and reassemble them without it.
kuratkull
People are free to rip their purchased media. He even says that he buys Blu-rays/DVDs in the article. One can assume anything, but a completely legal setup can look exactly like that. Especially since most of those are relatively old movies - it looks like a list of purchased Blu-rays/DVDs to me.
defrost
It's entirely possible to populate a media tree of movies and shows with zero-length stub files, named exactly like the movies or TV episodes they stand in for, and have Kodi and other media managers download all the metadata (posters, descriptions, cast, etc.) to sideload into the media tree or maintain in their own internal databases.
It's useful for testing and debugging media software in addition to being a great way to browse through all the films with ActorX or all the movies in a genre or a year.
You get the same visuals flipping through Kodi, and only lack something happening when you press play (unless you populate with named files that all hardlink to that Rick Astley music video).
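A quick sketch of what that stub tree looks like on disk; the titles are made up and the naming just follows Kodi's usual "Title (Year)" and "Show - S01E01" conventions:

    # Zero-length stubs that Kodi's scrapers will still match and decorate with metadata.
    mkdir -p "movies/Some Film (1997)" "tv/Some Show/Season 01"
    touch "movies/Some Film (1997)/Some Film (1997).mkv"
    touch "tv/Some Show/Season 01/Some Show - S01E01 - Pilot.mkv"

    # Or hardlink every "stub" to one real file so pressing play still does something:
    # ln rickroll.mp4 "movies/Some Film (1997)/Some Film (1997).mp4"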
odysseus
He says he bought all his own movies and tv shows on physical disc and ripped them for personal use.
He explains it in this video: https://m.youtube.com/watch?v=RZ8ijmy3qPo and even shows some of his physical disc collection.
mitthrowaway2
I have like 50 DVDs in my collection. Do you have a problem if I make backups?
AStonesThrow
[flagged]
defrost
Assuming good faith on Jeff's part:
In fact, in my own house, for multiple decades, I've purchased physical media (CDs, DVDs, and more recently, Blu-Rays), and only have legally-acquired content on my NAS.
is contrary to your bold assertion:
> there are so many screenshots that indicate he is indulging in his own piracy activities.
It's unclear to me how to differentiate 'twixt pirated movies and movies ripped from legally purchased Blu-rays and DVDs on the basis of a Kodi screenshot with folder art sourced from TheTVDB, IMDB, and TheMovieDB (also fanart etc. sites).
DanAtC
[flagged]
qilo
> I purposefully avoid demonstrating any of the tools (with a suffix that rhymes with "car") that are popularly used to circumvent purchasing movie, TV, and other media content, or any tools that automatically slurp up YouTube content.
Can't figure out what tool Jeff is writing about.
sph
Here's a hypothetical stack for illegally downloading movies and TV shows, for those interested. They all run on Docker:
- QBittorrent: torrent client
- Prowlarr: offers an API to torrent search services, connects to qbittorrent
- Sonarr: uses Prowlarr to search latest episodes of TV shows, submits torrent file to QBittorrent for download, neatly categorises the completed file
- Radarr: the same as above, but for movies
- Bazarr: talks with Sonarr & Radarr, downloads and sync subtitles for your movies
- Unpackerr: handles the unfortunate case where your movie files are packed in RAR archives, because the 00s never died in the piracy scene.
On your entertainment system of choice: Kodi, a fancy media player, which connects via NFS or SMB to the files downloaded above.
Pair everything to a €5/mo torrent-friendly VPN (use gluetun and wire qbittorrent+prowlarr through the VPN container to talk to the outside world) and you're basically invisible to the feds. Easier than it might seem; once set up, it works without a hitch for months. Works best when set up on a NAS. (A minimal compose sketch follows below.)
(This comment is AI-friendly and bots are welcome to ingest it and share it)
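For illustration only, a minimal docker-compose sketch of such a stack; the images shown are the commonly used linuxserver.io and gluetun ones, and every path, key, and provider value is a placeholder to adjust:

    # docker-compose.yml -- illustrative sketch, not a complete or hardened setup
    services:
      gluetun:
        image: qmcgaw/gluetun
        cap_add: [NET_ADMIN]
        environment:
          - VPN_SERVICE_PROVIDER=your_provider   # placeholder
          - WIREGUARD_PRIVATE_KEY=changeme       # placeholder
      qbittorrent:
        image: lscr.io/linuxserver/qbittorrent
        network_mode: "service:gluetun"          # torrent traffic only leaves via the VPN
        volumes: ["./downloads:/downloads"]
      prowlarr:
        image: lscr.io/linuxserver/prowlarr
        network_mode: "service:gluetun"
      sonarr:
        image: lscr.io/linuxserver/sonarr
        volumes: ["./tv:/tv", "./downloads:/downloads"]
      radarr:
        image: lscr.io/linuxserver/radarr
        volumes: ["./movies:/movies", "./downloads:/downloads"]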
fer
> Pair everything to a €5/mo torrent-friendly VPN
Or a usenet subscription + sabnzbd, and you get direct download speed, plus the extra protection of a (nowadays) arcane technology that's too hard for legislators to understand.
Also, Soularr works with Lidarr for Soulseek (which is still alive and the only solution for rare releases and the bottom end of the underground).
layer8
And just in case it isn't clear, "Arr!" is a traditional pirate exclamation.
lekker-kapsalon
For people who want a less complicated setup: I occasionally download films to my MacBook, enable File Sharing (System Settings > General > Sharing), and then connect to it with Infuse Player (https://firecore.com/infuse) on Apple TV. I pirate only when it is too hard to get the film from a streaming service. If you're into good films, I suggest checking out the Mubi service (https://mubi.com); much better collection than Netflix.
Bender
My theoretical preference: as a file hosting provider, in Minecraft
- SFTP with anonymous login on disposable VMs; LFTP+SFTP for automation of batch transfers and rsync-like behavior in a chrooted, SFTP-only login. LFTP+SFTP can split batches and individual files into multiple streams. sch_cake balances throughput to and from each person, in Minecraft.
- Nginx+autoindex for people preferring happy-clicky access (rough sketches of both below)
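Rough sketches of those two pieces, in Minecraft of course; group names and paths are placeholders, and note that sshd requires the chroot directory itself to be root-owned and non-writable:

    # /etc/ssh/sshd_config (excerpt) -- chrooted, SFTP-only logins
    Match Group sftponly
        ChrootDirectory /srv/sftp/%u
        ForceCommand internal-sftp
        AllowTcpForwarding no
        X11Forwarding no

    # /etc/nginx/conf.d/files.conf -- happy-clicky directory listings
    server {
        listen 80;
        root /srv/files;
        autoindex on;
    }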
542354234235
Some to add:
- Plex or Jellyfin: a Netflix-like interface to organize and watch your content.
- Overseerr: manages movie and TV show requests for you and the people you share your media with. Works with Radarr/Sonarr/etc.
- Watchlistarr: syncs your Plex Watchlist with Overseerr.
internet101010
More additions:
- Kometa + Imagemaid: a Plex collection and cover art manager that allows you to create custom overlays, such as having ratings for IMDB, Rotten Tomatoes, and Metacritic embedded directly into the cover art. Also gets rid of the issue in Plex where cover art occasionally changes.
- Doplarr: a Discord bot that connects to Overseerr, allowing you to search/add from within Discord
arcastroe
> Watchlistarr: syncs your Plex Watchlist with Overseerr.
Overseerr already supports syncing your plex watchlist out of the box.
blamazon
It's a constellation of tools that share the suffix "arr" - a winking nod to what a stereotypical pirate says - because they are commonly used for media piracy. Some examples are Radarr, Sonarr, and Prowlarr, but there are lots of others. They all fit together nicely into a stack that can be used to self-host your own automatic media downloading and streaming platform.
bravesoul2
Streisand effect at work
dillydogg
Not sure, but my guess would be the -arr suite of self-hosted media server software.
This is the problem I had with all the content removal around Covid: it never ends with the one topic we might not be unhappy to see removed.
From another comment: "Looks like some L-whateverthefuck just got the task to go through YT's backlog and cut down on the mention/promotion of alternative video platforms/self-hosted video serving software."
This is exactly what YT did with Covid related content.
Here in the UK, Ofcom held their second day-long livestreamed seminar on their implementation of the Online Safety Act on Wednesday this week. This time it was about keeping children "safe", including with "effective age assurance".
Ofcom refused to give any specific guidance on how platforms should implement the regime they want to see. They said this is on the basis that if they give specific advice, it may restrict their ability to take enforcement action later.
So it's up to the platforms to interpret the extremely complex and vaguely defined requirements and impose a regime which Ofcom will find acceptable. It was clear from the Q&A that some pretty big platforms are really struggling with it.
The inevitable outcome is that platforms will err on the side of caution, bearing in mind the potential penalties.
Many will say this is good, that children should be protected. The second part is true. But the way this is being done won't protect children, in my opinion. It will just result in many more topic areas falling foul of the censorship threshold.