Sam Altman Wants Your Eyeball
137 comments
May 10, 2025
the_d3f4ult
jt2190
I presume that you would want your crypto gains to require “proof of identity” to access, not just “proof of humanity”, which is what World provides.
Also, once you have generated keys with your eyes, you use the keys, not your eyes, as tokens to validate your humanness.
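To make that split concrete, here is a minimal sketch assuming an Ed25519 keypair minted once at enrollment (the Python cryptography library calls are real, but the function names and flow are illustrative, not World's actual protocol):

    # One-time enrollment: the iris scan only gates key issuance, then is discarded.
    # Day to day, only the key is used to prove "same enrolled human".
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    def enroll():
        # Issued only after the orb confirms this iris is not already enrolled.
        private_key = ed25519.Ed25519PrivateKey.generate()
        return private_key, private_key.public_key()

    def prove_humanness(private_key, challenge: bytes) -> bytes:
        # Normal use: sign a server challenge with the key; no eye involved.
        return private_key.sign(challenge)

    def verify(public_key, challenge: bytes, signature: bytes) -> bool:
        try:
            public_key.verify(signature, challenge)
            return True
        except InvalidSignature:
            return False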
sillyfluke
Thanks for this interesting point of view. However, it may make sense to consider whether the gap you point to can be reduced significantly with AI, by training it separately on aging-eye data using existing medical data independent from the general iris pool. I have no idea how realistic that would be, personally.
beAbU
Are you willing to gatekeep your wealth behind an AI that might hallucinate that your iris has changed more than it actually has?
al_borland
I wondered this with things like FaceID. In my head, every time it unlocks it's tweaking and tuning what it knows to be the user, so it can adapt as a person ages, goes through weight changes, etc. However, in practice this may not be the case, since there is an easy backup and a person can re-register pretty easily.
On the topic of eyes, my dad recently had surgery on his eyes and they did one at a time, for obvious reasons. That could be a way to transition. Register both, have surgery on one, let it heal, register the healed eye, have surgery on the other, then register that. Always using the good and registered eye to authenticate. But this isn’t realistic. It requires way too much forethought and planning, when people’s minds are elsewhere.
caseyy
Meanwhile, in reality, banks solved proof of humanity, identity (KYC), and financial services a long time ago. Any account in the world can be proven human by making a debit/credit card transaction from a card with a matching name.
For the first time, you can now give your biometrics to OpenAI to do nothing more than you already could. This is just a pure cult of personality.
josh2600
I can only assume that you don’t work in financial technology if you believe that KYC is a solved problem.
Proving authenticity in an increasingly diasporic society is difficult.
We should seek to either reduce or embrace entropy in the design of our systems. You either want systems which prove there are no Sybil attacks, or you manufacture halls of mirrors.
This is a continuous battle, there’s no panacea here, even the eye scan has threat vectors.
Calling KYC a solved problem is ludicrous.
lokar
It depends what you mean by solved.
Banks tend not to over-engineer things. KYC can be seen as a three-sided trade-off: the cost of KYC infra/process/etc., lost revenue from denied business, and fines from regulators.
They (the banks) don’t really care about the social goals of KYC, they just try to best optimize for expected value in the trade offs.
The regulators understand this, and are basically fine with it. They have their own trade offs they are balancing.
Both sides mostly find an equilibrium.
fc417fc802
Even considering the social goals there's no need for a 100% solution. We only need to stop most fraud and reduce the impact of the fraud that does happen to a sufficiently low level.
One of the more important goals isn't to directly stop fraud but instead to provide tools that give end users results that scale with the amount of effort invested. The level of risk should be a tradeoff that the end user is able to make.
Current solutions mostly allow for that but certainly have some rough edges.
wnevets
Even if you 100% believe in your heart that Altman would never do anything negative with these scans, that doesn't mean someone else won't if/when they get access to them. People may have trusted 23andMe with their genetic data, but now no one knows who will end up owning it, or why the buyer believes buying the data is profitable.
https://wydaily.com/latest/regional-national/2025/05/08/23an...
wing-_-nuts
When talking about government surveillance, people often ask some version of 'well, if you're not doing anything wrong, what are you afraid of?'. My response is always, 'It's not that I don't trust our current government, I don't trust all future governments that come after this one'.
I hope that, given the recent events of having hired thugs rifle through government databases (including the OPM, which supposedly has very sensitive data from security clearance applications), it becomes clear that letting people collect and store data on you should be avoided at all costs.
The older I get, the more I understand the stereotype of the eccentric former techie who no longer wants anything to do with modern technology or society
patrickmay
I see no reason to trust the current government, nor any of the previous ones in my lifetime.
napierzaza
[dead]
theyinwhy
Why would people distrust governments more than companies? I never understood that part.
fc417fc802
I distrust both, but the government is generally capable of much more significant threats to me. Notably it is the government which prevents companies from posing similar threats (at least currently, in the west).
wing-_-nuts
Sorry, I was only giving government surveillance as an example, I trust no one to protect my privacy, government or corporation.
dmitrygr
Because the government has more power over you. Thus, they deserve more scrutiny and suspicion.
andoando
I mean, I don't want the private details of my life exposed, illegal or not. Who does?
dillydogg
I know, it's not as if I just let the cops walk through my house whenever they want. I just don't understand the "I've got nothing to hide" defense.
the_snooze
> 'well, if you're not doing anything wrong, what are you afraid of?'
I like to go with the simpler "I hold lots of sensitive data for people who trust me: my family, my friends, my employer. One would have to be a sociopath to disclose other people's secrets without their consent."
qwertox
Same happened with Komoot, a popular German outdoors app. They have millions of tracks and user profiles, it's also a bit of a social network.
From one day to the next they sold to the Italian "developer company" Bending Spoons.
That company had acquired Brightcove a few months earlier, along with several other services like Evernote and Meetup, companies which have nothing to do with the outdoors.
From one day to the next they got access to my 13,000 tracked km and 800 hours of data. My profile is set to private, but now I have to assume that all this data will be sold to advertisers. "Anonymized".
sssilver
Every company is exactly one CEO away from doing something no one would ever expect.
usrnm
Most companies are exactly 0 CEOs away from that
Squeeeez
Not only companies sadly.
plastic3169
It is insane to defend Sam Altman here, but it looks to me like World goes out of their way not to link the biometric data to identity, or even to save it. Sure, you need to trust the black box, but if the company gets taken over there is no data for the new evil owners to access.
tim333
It's kind of different in that with 23andMe they know your name and genetic data. With Worldcoin you don't give your name and they just get a photo of your eyes.
wnevets
> they just get a photo of your eyes.
this comes across as if you are attempting to downplay the importance of this biometric data which is weird considering Altman is paying to get access to just a photo of your eyes.
ipaddr
So "anyone who can scan your eyes can get into your bank account" is very different from 23andMe, where you can give a fake name, buy a test with cash, and they collect limited data that isn't worth much to advertisers. The first opens up many risks and surveillance opportunities; the second mostly matters where pooled DNA can link you to physical crimes or confirm paternity.
With the first, a single data breach anywhere lasts a lifetime. The second is the bigger media story only because people believe what can be done with the DNA 23andMe collects is more than the reality.
vizzah
no, there shouldn't be a photo, only a hash, which is useless by itself.
an0malous
I don’t think it’s the same, the eye scan isn’t linked to any other PII so all they know is “this is an eye scan of a human being.” What kind of abuse could be done with that data? I don’t think they even store the scan, they store a hash of it.
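If that is roughly how it works, the dedup check reduces to comparing digests rather than images. A minimal sketch under that assumption (real iris codes are noisy, so a production system would need fuzzy matching rather than an exact hash; everything here is illustrative):

    import hashlib

    enrolled_hashes: set[bytes] = set()

    def try_enroll(iris_code: bytes) -> bool:
        # Keep only a one-way digest of the iris code, never the image itself.
        digest = hashlib.sha256(iris_code).digest()
        if digest in enrolled_hashes:
            return False  # this eye has already been enrolled
        enrolled_hashes.add(digest)
        return True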
And on the other hand, I do wish there was a way to distinguish real humans from bots on the Internet. I think it’s only a matter of time until the web becomes useless thanks to AI. What’s a better solution?
amelius
I would trust Apple, though.
iamthejuan
They did that here in the Philippines, and they exploit the poor. They give people money to let them scan their eyes. These people do not know the consequences of what they are doing.
ChadNauseam
What are the consequences?
andy_ppp
That you can be identified, cataloged and controlled, potentially. We have the technology to create heaven or hell depending on who controls it…
0cf8612b2e1e
Is that any different from reality now? At least they throw a few dollars at you for it.
I suspect that my face has been recorded and linked to my profile at several stores. Palantir or similar have probably scraped all of the internet looking to link a face to an identity.
Real ID just became fully required for domestic air travel.
alwa
Is it certain that impoverished people would weigh those potential consequences more heavily than being paid today?
For that matter, do we expect that the impoverished people the gp commenter refers to would resist, say, government-led efforts to compel their biometrics from them? [0]
[0] e.g. https://www.csmonitor.com/World/Asia-South-Central/2022/0425...
BurningFrog
That ship has already sailed. We can be identified and found in any number of ways already. Our kids will not imagine any other way.
As you say, the future can become heaven or hell.
Keyframe
not that it makes it any better, but you're saying that like it's impossible to do now what you're describing. you're already in the system, whether you like it or not, both public and private one(s).
koakuma-chan
Renders biometric auth pointless.
jbverschoor
The consequence is that you can have an actual fair global economy and better wealth distribution
dheera
> These people do not know the consequences of what they are doing.
They might have known the consequences, but money is money. I feel like for 99% of people there is a certain sum of money for which they will give in to pretty much any kind of data collection. Even I'd give in if they gave me enough. The bar is just higher, but there does exist a certain $X for which I would give in as well.
alwa
Most times I’ve been biometrically catalogued, I wasn’t paid for the privilege. The government just kind of said I had to, and I wasn’t in a position to argue.
I guess I am in some sense compensated by the data brokers who psychometrically profile my internet use and resell their conclusions—but “free ad-supported internet content” isn’t exactly fungible cash…
wellthisisgreat
How much would you want for your genome?
dheera
I haven't thought about an exact number, but I'd probably cave in for $20M.
I'd finally be able to afford a house, never have to work for a toxic company again in my life, and could afford various preventative medical care without relying on insurance.
Basically, "life-changing" money.
cube00
Hopefully everyone who handed over their DNA to 23andMe will remember how that ended before they hand over their iris to Sam.
echelon
They haven't seen any impact yet. Nobody is rejecting them from insurance or job offers yet.
It's worse with 23andme, too, because the blast radius is all of your relatives that didn't take the test at all.
water-data-dude
Aside from the issue of biometrics not being covered by the 5th amendment (so I won’t use them for login purposes), I’m hesitant to arrange incentives such that melon baller based crime is lucrative.
__MatrixMan__
Biometrics are not a viable solution to the sybil problem.
The more biometric tech converges on the ability to get a cryptographic hash of one's body, the further it retreats from the kind of thing that a layperson will trust. You end up with a root of trust that <1% of the population can verify, and then you end up asking 100% of them to rely on systems built on that root. You're never going to be able to convince even a majority of people that some clever hacker hasn't cracked an iris scanner and associated millions of fake IDs with millions of AIs for scam purposes.
It needs to be the kind of thing that lets Alice assert that this key goes with Bob just after she shook Bob's hand in meatspace. Something where, in order for Bob to have two identities according to Alice, he'll have to meet her in meatspace twice and manage to have her not notice that she's already met him once before. PGP key signing parties were pretty much there, they just came too early (and not enough work was done to teach the masses about them).
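As a toy illustration (hypothetical format, not PGP's), that kind of in-person attestation can be as small as Alice signing a statement that binds Bob's key fingerprint to the meeting:

    import json, time
    from cryptography.hazmat.primitives.asymmetric import ed25519

    def attest_in_person(alice_key: ed25519.Ed25519PrivateKey,
                         bob_fingerprint: str, where: str) -> tuple[bytes, bytes]:
        # Alice's claim: "I met the holder of this key in meatspace."
        statement = json.dumps({
            "type": "met-in-person",
            "subject": bob_fingerprint,
            "where": where,
            "when": int(time.time()),
        }, sort_keys=True).encode()
        return statement, alice_key.sign(statement)

Anyone who already trusts Alice's key can verify that signature and add an Alice-to-Bob edge to their local trust graph.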
The web becomes more of a dark forest with each passing day. Eventually the cost of maintaining your part of the trust graph will be lower than the cost of getting screwed by some root of trust that you can't influence or verify. I'm sad to say that I think the point where these lines cross is significantly down and to the right of where we are.
fc417fc802
> PGP key signing parties were pretty much there, they just came too early (and not enough work was done to teach the masses about them).
I won't dispute that PGP key signing parties coupled with government ID work very well for certain very specific use cases, such as validating distro maintainers.
However, for more mainstream and widespread uses that adoption never occurred. And what about work on the tooling? I've yet to see a web of trust implementation that really felt properly generalized, scalable, and intuitive to interact with.
Case in point, if you wanted to implement a distributed code auditing solution on top of git and signed commits, what library would you use for the web of trust graph calculations? And would key signing parties be a usable root of trust for that with the current state of the software ecosystem? My personal view is that both of those things are woefully lacking.
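A toy sketch of the kind of graph query such a library would need to answer (plain Python, purely illustrative; a bounded-depth vouching chain stands in for a real trust policy):

    from collections import deque

    # Directed trust edges: who has signed/vouched for whom.
    trust_graph: dict[str, set[str]] = {
        "alice": {"bob"},
        "bob": {"carol"},
        "carol": set(),
    }

    def trusts(root: str, target: str, max_depth: int = 3) -> bool:
        # Is there a vouching chain of bounded length from root to target?
        seen, queue = {root}, deque([(root, 0)])
        while queue:
            node, depth = queue.popleft()
            if node == target:
                return True
            if depth < max_depth:
                for nxt in trust_graph.get(node, set()) - seen:
                    seen.add(nxt)
                    queue.append((nxt, depth + 1))
        return False

    # e.g. trusts("alice", "carol") -> True; trusts("carol", "alice") -> False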
tim333
I did the worldcoin scan thing a couple of years ago and it's all quite jolly. The article is a bit scaremongering. Re:
>Simply put, the premise is this: scan your eyeball, get a biometric tag, verify yourself, buy our apps (and cryptocurrency). ... Minority Report style technology
it's not really like that. They take a photo of your eye to check you are a new person and not someone who has an account already, then give you an account which is like an anonymous crypto wallet with a private key. You never do an eye scan again in normal use. They give you free crypto/money rather than you needing to buy anything. I've been given ~$300 - it fluctuates a fair bit with crypto prices.
I recommend it to anyone who's curious / positive about new tech.
qwertox
> I recommend it to anyone who's curious / positive about new tech.
What does this have to do with curiosity or new-tech positivity? Nothing.
Give biometric data, get a fluctuating ~$300. You did nothing other than sell something you have. I'm not judging.
A4ET8a8uTh0_v2
<< I recommend it to anyone who's curious / positive about new tech.
I would ask that you elaborate a little more. I am an example of one. I like LLMs, but I cringe internally and externally at times at some of the things people seem to want to use them for (and I just saw a presentation that basically said the equivalent of "add AI here, happy sunshine leaves there". And how? Magic. Nobody knows.). I like crypto, but it is impossible not to see the million rugpulls, scams, and so on out there. I like technology, but I am very, very aware of the issues with basic human nature.
voytec
Biometric data is valuable. Assuming that Musk's DOGE crew copied data obtained from US government agencies, they may have also obtained biometric data of EU citizens. At least some countries have shared citizens' fingerprints with the US, and not just Visa Waiver Program applicants': as I understand it, the previous government of Poland made a deal to share all citizens' fingerprints, and these are collected from anyone renewing their government ID card.
mzajc
> previous government of Poland made a deal to share all citizens' fingerprints
This piqued my interest, but I couldn't find anything. Do you know where I could find more information about it?
voytec
Sadly, I'm unable to find specific documents confirming it. AFAIK, Poland agreed to biometric data sharing with the US Office of Biometric Identity Management in exchange for loosening travel requirements. That said, the US seems to be pushing [0] for more such agreements.
[0] https://www.statewatch.org/news/2023/august/eu-and-usa-ploug...
Waterluvian
You got me thinking and I’m not sure my fingerprints or any other biometric data have ever formally been recorded by the federal, provincial, or local governments.
In most cases the most biometric data they have is your photo and, I guess, your height?
So beginning to normalize the collection of eyeball data as a thing is a pretty significant escalation.
n_ary
In the EU, obtaining an identity card requires a recent biometric (important!) photo and certain fingerprints. Not sure about the US.
Also, where I work, to enter certain facilities I need to not only scan my badge but also my fingerprint, or sometimes my palm (may sound absurd, but I am sure some of you work in the same sector).
Geee
This whole idea doesn't make any sense. Someone with a world ID could still be running AI agents on their behalf with their private key, or use stolen / bought keys from other people. On the other hand, Sam Altman could be running millions of fake personas, because they can generate keys from thin air. Also, Sam Altman would have the power to invalidate your keys, or the keys of people he doesn't like. It would be an absolute catastrophe if this system was used for voting or something important.
mapcars
Something I don't understand is how is that so bad? Even today one can buy passport data, social security numbers etc on black markets leaked by government employees in most countries. Once they start using more biometric data I'm sure it will be leaked as well.
If we assume that all this information is permanently available in a public blockchain, how does it change anything for society really? I can think of security checks becoming better, what are the negative possibilities?
creata
Justifying what Worldcoin is doing by comparing them to black market leaks isn't helping their case.
mapcars
I'm not justifying Worldcoin and I don't know all the details about what they are doing.
My realistic assumption is that all this data will become public one way or another, so I'm trying to understand how we can make sure it can't be abused.
ethbr1
Why would anyone want anything to do with Sam Altman to have control of it though?
At least if it's open, access is equal.
udev4096
He can chortle my (eye)balls!
moffkalast
Anton doesn't call me anything. He grimly does his work, then he sits motionless until it's time to work again. 4o could take a page from his book.
I'm an ophthalmologist. I look at irises all day. People's irises change over the course of their life, sometimes dramatically if they have some kind of pathology. Are they updating their model periodically? What keeps someone from getting locked out of their crypto gains if they develop an iris nevus, have cataract surgery, or start on Flomax?