A major AI training data set contains millions of examples of personal data
94 comments · July 30, 2025
djoldman
Just to be clear, as with LAION, the data set doesn't contain personal data.
It contains links to personal data.
The title is like saying that sending a magnet link to a torrent of copyrighted material is distributing copyrighted material. Folks can argue whether that's true, but the discussion should at least be transparent.
yorwba
I think the data set is generally considered to consist of the images, not the list of links for downloading the images.
That the data set aggregator doesn't directly host the images themselves matters when you want to issue a takedown (targeting the original image host might be more effective), but for the question "Does that mean a model was trained on my images?" it's immaterial.
kazinator
When the model is trained, are the links not resolved to fetch whatever they point to, and that goes into the model?
Secondly, privacy and copyright are different. Privacy is more about how information is used, whereas copyright is about getting credit and monetization for being the author.
anonymoushn
no, normally your training pipeline wouldn't involve running bittorrent
kazinator
If the training set contained BitTorrent magnet links to the desired information (e.g. images whose pixels are to be trained on), then, yes, it would have to.
Upthread it was mentioned that the training data representation contained links to material; magnet links were mentioned in passing as an example of something supposedly not violating copyright. It wasn't stated that training data contained magnet links. (Did it?)
duskwuff
That's a distinction without a difference. Just as with LAION, anyone using this data set is going to be downloading the images and training on them, and the potential harms to the affected users are the same.
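To make that concrete, here's a minimal, purely illustrative sketch of what "using" a links-only dataset looks like in practice. The CSV layout and file names are assumptions (real pipelines use dedicated downloaders such as img2dataset against the released metadata), but the effect is the same: the URLs get resolved and the images land on disk as training data.

    # Illustrative only: a "links-only" dataset (one image URL per row) still
    # ends up as pixel data on disk before any training happens.
    # The "url"/"caption" CSV schema is an assumption, not CommonPool's real format.
    import csv
    from pathlib import Path

    import requests

    def materialize(url_list_csv: str, out_dir: str) -> None:
        out = Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        with open(url_list_csv, newline="") as f:
            for i, row in enumerate(csv.DictReader(f)):
                try:
                    resp = requests.get(row["url"], timeout=10)
                    resp.raise_for_status()
                except requests.RequestException:
                    continue  # dead links are simply skipped
                (out / f"{i:08d}.jpg").write_bytes(resp.content)       # the image itself
                (out / f"{i:08d}.txt").write_text(row.get("caption", ""))

    # materialize("commonpool_urls.csv", "images/")  # hypothetical file names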
Frieren
> The title is like saying that sending a magnet link to a copyrighted torrent file is distributing copyright material.
I interpret the article as being about AI being trained on personal data. That is a big breach of many countries' legislation.
And AI is 100% being trained on copyrighted data too, breaking yet another set of laws.
That shows how much big-tech is just breaking the law and using money and influence to get away with it.
bearl
Links to pii are by far the worst sort of pii, yes.
“It’s not his actual money, it’s just his bank account and routing number.”
djoldman
A more accurate analogy is "it's not his actual money, it's a link to a webpage or image that has his bank account and routing number."
bearl
My contention is that links to pii are themselves pii.
A name, Jon Smith, is technically PII but not very specific. If I have a link to a specific Jon Smith’s facebook page or his HN profile, it’s even more personally identifiable than knowing his name is Jon Smith.
cheschire
I hope future functionality of haveibeenpwned includes a tool to search LLM models and training data for PII based on the collected and hashed results of this sort of research.
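Something like the k-anonymity range queries that Pwned Passwords already uses could work here. Purely as a hypothetical sketch (the endpoint and corpus name below are invented, not a real haveibeenpwned API): hash the PII locally, send only a short hash prefix, and check the returned suffixes.

    # Hypothetical sketch of such a lookup. The endpoint does NOT exist; it only
    # illustrates the k-anonymity pattern: the raw PII never leaves the client,
    # only the first 5 hex characters of its hash do.
    import hashlib

    import requests

    def pii_seen_in_corpus(value: str, corpus: str = "datacomp-commonpool") -> bool:
        digest = hashlib.sha1(value.strip().lower().encode()).hexdigest().upper()
        prefix, suffix = digest[:5], digest[5:]
        resp = requests.get(f"https://example.invalid/range/{corpus}/{prefix}", timeout=10)
        resp.raise_for_status()
        # Assumed response format: one "SUFFIX:COUNT" pair per line.
        return any(line.split(":", 1)[0] == suffix for line in resp.text.splitlines())

    # pii_seen_in_corpus("jon.smith@example.com")  # hypothetical usage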
satvikpendem
This is all public data. People should not be putting personal data on public image hosts and sites like LinkedIn if they do not want it to be scraped. There is nothing private about the internet, and I wish people understood that.
boie0025
While I agree with your sentiment, there's a pretty good chance that at least some of this is, for example, data that inadvertently leaked when someone accidentally exposed an automatic directory index with Apache, or an asset manifest exposed a bunch of uploaded images in a folder or bucket that wasn't marked private for whatever reason. I can think of a lot of reasons this data could be "public" that would be well beyond the control of the person exposed. I also don't think there's a universal enough understanding that uploading something to your WordPress or whatever personal/business site, to share with a specific person via an obscure unpublished URL, is actually public. I think these lines are pretty blurry.
Edit: to clarify, in the first two examples I'm referring to web applications that the exposed person uses but does not control.
andsoitis
> There is nothing private about the internet and I wish people understood that.
I don’t know that that is useful advice for the average person. For instance, you can access your bank account via the internet, yet there are very strong privacy guarantees.
I concur that what you say is a safe default assumption, but then you need a way for people not to start mistrusting all internet services because everything is considered public.
satvikpendem
I'm talking more about consciously posting things on some platform, not accessing them.
pera
> This is all public data
It's important to know that generally this distinction is not relevant when it comes to data subject rights like GDPR's right to erasure: If your company is processing any kind of personal data, including publicly available data, it must comply with data protection regulations.
booder1
The law has in no way been able to keep up with AI. Just look at copyright. Internet data is public, and the government is incapable of changing this.
chrisg23
What if someone else posts your personal data on the public internet and it gets collected into a dataset like this?
satvikpendem
How is that not a different story?
malfist
What's important is that we blame the victims instead of the corporations that are abusing people's trust. The victims should have known better than to trust corporations
nerdjon
Right, both things can be wrong here.
We need to better educate people on the risks of posting private information online.
But that does not absolve these corporations of criticism of how they are handling data and "protecting" people's privacy.
Especially not when those companies are using dark patterns to convince people to share more and more information with them.
thinkingtoilet
If this was 2010 I would agree. This is the world we live in. If you post a picture of yourself on a lamp post on a street in a busy city, you can't be surprised if someone takes it. It's the same on the internet and everyone knows it by now.
Workaccount2
I have negative sympathy for people who still aren't aware that if they aren't paying for something, they are the something to be sold. This has been the case for almost 30 years now with the majority of services on the internet, including this very website right here.
exasperaited
People are literally born into that misunderstanding all the time (because it’s not obvious). It’s an evergreen problem.
So you are basically saying you have no sympathy for young people who happen to have not been taught about this, or been guided by someone highly articulate in explaining it.
Is it taught in schools yet? If it’s not, then why assume everyone should have a good working understanding of this (actually nuanced) topic?
For example I encounter people who believe that Google literally sells databases, lists of user data, when the actual situation (that they sell gated access to targeted eyeballs at a given moment and that this sort of slowly leaks identifying information) is more nuanced and complicated.
malfist
That explains why ISPs sell DNS lookup history, or your utility company sells your habits. Or your TV tracks your viewership. I've paid for all of those, but somehow, I'm still the product.
gishglish
Tbh, even if they are paying for it, they’re probably still the product. Unless maybe they’re an enterprise customer who can afford magnitudes more to obtain relative privacy.
keybored
Modern companies: We aim to create or use human-like AI.
Those same modern companies: Look, if our users inadvertently upload sensitive or private information then we can't really help them. The heuristics for detecting those kinds of things are just too difficult to implement.
blitzar
> blame the victims
If you post something publicly, you can't complain that it is public.
lewhoo
But I can complain about what happens to said something. If my blog photo becomes deepfake porn, am I allowed to complain or not? What we have is an entirely novel situation (with AI) worth at least a serious discussion.
malfist
Sure, and if I put out a local lending library box in my front yard I shouldn't be annoyed by the neighbor that takes every book out of it and throws it in the trash.
Decorum and respect expectations don't disappear the moment it's technically feasible to be an asshole
squigz
> The victims should have known better than to trust corporations
Literally yes? Is this sarcasm? Are we in 2025 supposed to implicitly trust multi-billion dollar multi-national corporations that have decades' worth of abuses to look back on? As if we couldn't have seen this coming?
It's been part of every social media platform's ToS for many years that they get a license to do whatever they want with what you upload. People have warned others about this for years and nothing happened. Those platforms have already used that data, prior to this, for image classification, identification, and the like. But nothing happened. What's different now?
Anonbrit
A hidden camera can make your bedroom public. Don't do it if you don't want it to be on pay-per-view?
satvikpendem
That is indeed what Justin.tv did, to much success. But that was because Justin had consented to doing so, just as anyone who posts something online consents to it being seen by anyone.
dlivingston
Your analogy doesn't hold. A 'hidden camera' would be either malware that does data exfiltration, or the company selling/training on your data outside of the bounds of its terms of service.
A more apt analogy would be someone recording you in public, or an outside camera pointed at your wide-open bedroom window.
windexh8er
We've got plenty of examples where Microsoft (owner of LinkedIn) is OK with spying on their users using methods akin to malware.
People who've put data on LinkedIn had some expectation of privacy at a certain point. But this is exactly why I deleted everything from LinkedIn, other than a bare minimum representation that links external to my personal site, after they were acquired.
Microsoft, Google, Meta, OpenAI... none of them should be trusted by anyone at this point. They've all lied and stolen user data. People have taken their own lives over legal retaliation for doing far less than the people hiding behind these corporate logos, who suck up any and all information because they feel entitled to never face consequences.
They've all broken their own ToS under an air of: OK for me, not for thee. So, yes, the hidden camera is a great analogy. All of these companies, and the people running them, are cancers in and on society.
dpoloncsak
Does this analogy really apply? Maybe I'm misunderstanding, but it seems like all of this data was publicly available already, and scraped from the web.
In that case, it's not a 'hidden camera'... users uploaded this data and made it public, right? I'm sure some of it was due to misconfiguration or whatever (like we see with Tea), but it seems like most of this was uploaded by the user to the clear web. I'm all for "don't blame the victims", but if you upload your CC to Imgur I think you deserve to have to get a new card.
Per the article "CommonPool ... draws on the same data source: web scraping done by the nonprofit Common Crawl between 2014 and 2022."
jeroenhd
AI and scraping companies are why we can't have nice things.
Of course privacy law doesn't necessarily agree with the idea that you can just scrape private data, but good luck getting that enforced anywhere.
1vuio0pswjnm7
archive.is is (a) sometimes blocked, (b) serves CAPTCHAs in some instances and (c) includes a tracking pixel
One alternative to archive.is for this website is to disable Javascript and CSS
Another alternative is the website's RSS feed
Works anywhere without CSS or Javascript, without CAPTCHAs, without tracking pixel
For example,

    curl https://web.archive.org/web/20250721104402if_/https://www.technologyreview.com/feed/ |
      (echo "<meta charset=utf-8>"; grep -E "<pubDate>|<p>|<div") > 1.htm
    firefox ./1.htm

To retrieve only the entry about DataComp CommonPool (selecting between the feed's post-id markers):

    curl https://web.archive.org/web/20250721104402if_/https://www.technologyreview.com/feed/ |
      sed -n '/./{/>1120522<\/post-id>/,/>1120466<\/post-id>/p;}' |
      (echo "<meta charset=utf-8>"; grep -E "<pubDate>|<p>|<div") > 1.htm
    firefox ./1.htm
1vuio0pswjnm7
If using a text-only browser that does not process CSS or run Javascript, 100% of the article is displayed
pera
Yesterday I asked if there is any LLM provider that is GDPR compliant: at the moment I believe the answer is no.
thrance
Mistral's products are supposed to be, at least, since they are based in the EU.
pera
I am not sure if Mistral is: if you go to their GDPR page (https://help.mistral.ai/en/articles/347639-how-can-i-exercis...) and then to the erasure request section they just link to a "How can I delete my account?" page.
Unfortunately they don't provide information regarding their training sets (https://help.mistral.ai/en/articles/347390-does-mistral-ai-c...) but I think it's safe to assume it includes DataComp CommonPool.
wilg
GDPR has plenty of language related to reasonability, cost, feasibility, and technical state of the art that probably means LLM providers do not have to comply in the same way, say, a social platform might.
tonyhart7
so your best bet is an open-weight LLM then???
but isn't that a breach of GDPR???
pera
There is currently no effective method for unlearning information, especially not when you don't have access to the original training datasets (as is the case with open weight models), see:
Rethinking Machine Unlearning for Large Language Models
atoav
Only if it contains personal data you collected without explicit consent ("explicit" here means literally asking: "I want to use this data for that purpose, do you allow this? Y/N").
Also people who have given their consent before need to be able to revoke it at any point.
xxs
> need to be able to revoke it at any point.
They also have to be able to ask how much (if any) of their data is being used, and how.
tonyhart7
so the EU basically locked itself out of the AI space????
idk, how can we do that with GDPR compliance etc???
itsalotoffun
I WISH this mattered. I wish data breaches actually carried consequences. I wish people cared about this. But people don't care. Right up until you're targeted for ID theft, fraud or whatever else. But by then the causality feels so diluted that it's "just one of those things" that happens randomly to good people, and there's "nothing you can do". Horseshit.
rypskar
We should also stop calling it ID theft. The identity is not stolen; the owner still has it. Calling it ID theft moves the responsibility from the party the fraud is actually committed against (often banks or other large entities) to an innocent third party.
herbturbo
Yes tricking a bank into thinking you are one of their customers is not the same as assuming someone else’s identity.
messagebus
As always, Mitchell and Webb hit the nail precisely on the head.
JohnFen
> Calling it ID theft is moving the responsibility from the one that a fraud is against (often banks or other large entities)
The victim of ID theft is the person whose ID was stolen. The damage to banks or other large entities pales in comparison to the damage to those people.
rypskar
I probably didn't formulate that well enough. By calling it ID theft you are blaming the person the ID belongs to, and that person has to prove they are innocent. By calling it by the correct words, bank fraud, the bank has to prove that the person the ID belongs to did it. No ID was stolen; it was only used by someone else to commit fraud. The banks don't have enough security to stop it because they have gotten away with calling it ID theft and putting the blame on the person the ID belongs to.
laughingcurve
It’s not clear to me how this is a data breach at all. Did the researchers hack into some database and steal information? No?
Because afaik everything they collected was the public web. So now researchers are being lambasted for having data in their sets that others released.
That said, masking obvious numbers like SSNs is low-hanging fruit. Trying to scrub every piece of public information about a person that could identify them is insane.
jelvibe25
What's the right consequence in your opinion?
passwordoops
Criminal liability with a minimum 2 years served for executives and fines amounting to 110% of total global revenue to the company that allowed the breach would see cybersecurity taken a lot more seriously in a hurry
lifestyleguru
Would be nice to have executives finally responsible for something.
bearl
Internet commerce requires databases with pii that will be breached.
Who is to blame for internet commerce?
Our legislators. Maybe specifically we can blame Al Gore, the man who invented the internet. If we had put warning labels on the internet like we did with NWA and 2 Live Crew, Gore's second-best achievement, we wouldn't be a failed democracy right now.
krageon
A stolen identity destroys the life of the victim, and there's going to be more than one. They (every single involved CEO) should have all of their assets seized, to be put in a fund that is used to provide free legal support to the victims. Then they should go to a low-security prison and have mandatory community service for the rest of their lives.
They probably can't be redeemed and we should recognise that, but that doesn't mean they can't spend the rest of their life being forced to be useful to society in a constructive way. Any sort of future offense (violence, theft, assault, anything really) should mean we give up on them. Then they should be humanely put down.
atoav
It doesn't now, but we could collectively decide to introduce consequences of the kind that deter anybody willing to try this again.
imglorp
Reader mode works on this site.
https://archive.is/k7DY3