California Attorney General issues consumer alert for 23andMe customers
321 comments
March 22, 2025
pmags
cj
> I never personally understood that
It’s a pretty simple cost/benefit equation.
For 90%+ of people, the benefit (or appeal) of seeing an ancestry report is greater than the cost (or risk) of handing over your DNA.
That said, it’s definitely fair to question why more people don’t take their personal privacy seriously. The reality is companies like Google (and 23andme) simply wouldn’t exist if everyone cared as much about privacy as the HN crowd. Google exists because consumers are fine with sharing their data, for better or worse.
theptip
I am extremely skeptical that many people are making an informed cost/benefit here. I would wager most users don’t even know about the license terms.
It’s the same as infosec in general. Most people don’t know about the risks, and anyway are bad at quantifying tail risk.
dkh
I am adopted. I spent most of my life having absolutely no idea whatsoever where I was from, or what biological risk factors I might have. 23andme was valuable to me on many levels, and even with the state the company currently finds itself in, it is not a decision I regret.
My wife also did 23andme some years ago, through which she discovered she had Factor V Leiden—a fact which became extremely important very soon after her discovering it, leading directly to changes in her treatment and how closely they monitored her for blood clots (she had a PFO and some other stuff going on that was already compounding her risk of clotting and stroke), and very possibly may have saved her life.
I’m supposed to go in and delete every trace of it out of fear of what the down-on-their-luck company might, or simply could, do?
While I know that my experience might be rare, I would regardless suggest that you reserve your skepticism, because you aren’t really in a position to assess who did or did not derive a justifiable amount of value from it, or how informed a human being they are.
hn_throwaway_99
I agree most folks aren't aware of the risks. But I'm guessing for the vast majority of people that are aware of the risks, the thought process is basically along the lines of:
1. I'm simply not that important. There are millions of other people who have given this data to 23 and me and the like, and I'm just some rando peon - nobody is going to be specifically searching for my DNA.
2. The "worst case scenarios", e.g. getting health insurance denied because you have some gene, still seem implausible to me. Granted, there is a ton of stuff I thought would be implausible 5-10 years ago that is now happening, but something like this feels like it would be pushed back against from all sides of the political spectrum, even in our highly polarized world.
3. I haven't murdered anyone, so I'm not worried about getting caught up in a DNA dragnet. Sure, there can be false positives, but to get on in life you pretty much have to ignore events with low statistical probability (or otherwise nobody would even get in a car on the road, and that has a much higher statistical probability of doing you harm).
AYBABTME
I think most people do the cost benefit analysis in a much more empirical manner than your theoretical framework. Most everybody has a justifiable reinforced belief that trading data for value is worth it, since the vast majority of people don't feel like they've been on the losing side when they participated in these transactions before.
One can argue that these people may not have understood that a transaction was occurring. I would argue that this is beside the point. Their intuition is hard to discredit in the face of their lived experience. Aside from the marketing spam, most people are probably right in thinking that they've been better off with Google/<alternative> than without.
We can pontificate that people should know more about what they agreed to, and so on, if only they knew better, etc. But this rings hollow and very hypothetical to the vast majority of us. It's worrisome in thought exercises, but not validated in real life.
acdha
Ever make a bug because you wrote code thinking about how you wanted it to work and forgot to consider how it could go wrong off of that happy path? I think things like this are basically the same problem: when someone is focused on the good outcome, it’s just not the right context for most people to carefully evaluate possible negative events, especially low-probability ones. They’re thinking it’d be cool to get an ancestry report, maybe lifesaving to get notice of a genetic problem, perhaps the excitement of an unknown relative, and unless there’s a neutral party involved the positives are probably going to win.
karparov
In my experience, even when people know, they just don't care.
Most people I talk to about this, tech and non-tech folk alike, have an attitude that's a mix of "you can't escape this anyway, so might as well embrace it" and "the misuse scenarios you are describing are pretty far-fetched".
creato
You don't need to know the license terms to know what is happening. Just observe that you get ads based on browsing and searching behavior. Most people can see that it's happening and don't care. Or at least not enough to give up the value they get in exchange.
aucisson_masque
I believe it’s more that people don’t see the potential threat and harm in providing sensitive data to commercial entities.
People who have, for instance, been wrongly jailed because Google gave their location history to law enforcement and they happened to be near a crime scene: those people understand the value of privacy.
bonoboTP
People who were struck by lightning know not to go out of the house in rain. People who got hit by a falling brick on the sidewalk know to wear a hard hat when out and about.
These horror stories are so rare that the vast majority of people have never personally met anyone who personally knew anyone who had it happen. So it's all entirely theoretical and speculative for people who are busy and have social and life goals. It's rightfully seen in a similar light as the prepper hobby or extreme zero-waste green philosophy. Worrying about it is basically a hobby, an identity, a community, an aesthetic. Most people have some other hobby and identity and don't need this one.
loa_in_
Perceived cost and perceived risk. It's an important distinction.
treyd
> The reality is companies like Google (and 23andme) simply wouldn’t exist if everyone cared as much about privacy as the HN crowd. Google exists because consumers are fine with sharing their data, for better or worse.
This refrain is repeated endlessly, but I've never heard a good argument for why it must be this way, or why, if it were any other way, Google simply couldn't exist in some (ideally better) form.
whilenot-dev
Which Google product out of the many[0] do you know that doesn't scrape data?
I think Google profits massively off of the ignorance of its users and is reliant on their unawareness that they're producers of actual relevant data.
rchaud
Google the search engine could exist without being a privacy pest. But once Google bought DoubleClick in 2007 and made that ad delivery platform its own, going down the panopticon path became inevitable.
nolist_policy
Google makes about $3 per user per month in ad revenue[0]. With VAT and transfer costs, that makes an equivalent subscription maybe $5 per month?
(Too much for my taste.)
Then you have to factor in that (far) fewer users are going to use/pay for their products if it's a subscription.
It's a tough calculation for sure.
[0] https://thenextweb.com/news/heres-how-much-money-you-made-go...
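A minimal back-of-the-envelope sketch of that conversion, in Python. The VAT rate and card-processing fee below are illustrative assumptions, not figures from the linked article.

    # What sticker price nets roughly the same ~$3/user/month Google reportedly earns from ads?
    target_net = 3.00                  # USD per user per month from ads (per the linked estimate)
    vat = 0.20                         # assumed VAT rate; varies by country
    fee_pct, fee_fixed = 0.029, 0.30   # assumed card-processing fee

    # net = price / (1 + vat) * (1 - fee_pct) - fee_fixed  ->  solve for price
    price = (target_net + fee_fixed) * (1 + vat) / (1 - fee_pct)
    print(f"equivalent subscription ~= ${price:.2f}/month")  # prints about $4.08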
smikhanov
It’s not about “the HN crowd”; it’s just that times have changed so much. Do you remember 2008? Back then, Facebook was a swanky way to reconnect with your classmates and leave some “Like” things under their wedding photos. Google was seen as a way of organizing the world’s information to make it universally accessible and useful, with its niche ad service still in limited beta. Twitter was so unknown that it hadn’t even won a Webby Award yet.
And 23andMe was already offering a $100 DNA sampling in 2008.
It’s easy to be cynical about this in 2025. Those who didn’t live through the early 2000s can’t even imagine the amount of optimism surrounding the tech industry at that time. Giving my DNA to a cool new Silicon Valley firm in 2008? Sure, why not; it was like buying a ticket to some utopian future.
cj
Your comment made me think of other companies we view as (mostly) “good” today.
E.g. tons of people take Uber/Lyft with no consideration about how those companies can take your travel history and daily schedule to monetize or sell to 3rd parties.
DoorDash - what if they start selling my order history to insurance companies as a variable to predict obesity/mortality?
HN/Reddit - what if some LLM scrapes all my comments, de-anonymizes me, and sells that info to a data broker?
Visa/Mastercard - what if 100% of my credit card history is sold to data brokers? (Spoiler, it already often is!)
Just trying to illustrate that even in 2025, we pick and choose what to view through rose colored glasses and what to frown upon.
In the example of Uber/Lyft, I willingly give up my home address and even let them know every time I’m out of town (trips to the airport), yet that seemingly doesn’t cross my mind when requesting a ride.
I don’t disagree with your comment, but IMO what was true in the 2000s is still true today: people overlook the risks of things when the benefits are substantial enough. That’s human nature.
AlexandrB
I lived through the early 2000s and it was already pretty easy to see how 23andme could go wrong. Unlike data that could leak from a company like Google or Facebook, your DNA is forever associated with you and can't be changed or obfuscated. IIRC, many on HN made the same point at the time.
Elsewhere in this discussion there are people talking about how "the common man" doesn't understand the risks of privacy loss. Well it really doesn't help that when those risks materialize you also have people claiming "well no one could have seen that coming".
nine_k
(Nitpick: the "Like" button was invented by FriendFeed, which was acqui-hired by FB, and Facebook implemented the "Like" button in 2009.)
quantified
I remember 2008. It was the same thing as now. The majority of people just want what they want, useful or shiny, and don't care about the rest. Remember how many signed up for housing loans they couldn't pay back, knew they didn't have the money, but it was cool and everyone else was doing it? Same thing, roughly. Was the case then and will be now.
jasonfarnon
How can someone possibly make a cost/benefit analysis when the future uses of public DNA data are so speculative? Criminals in the '70s didn't think leaving their DNA around could lead to their arrest 30 years down the line; it probably didn't factor into their cost/benefit analysis at all. I guess maybe you could figure there's safety in numbers: if loads of people are in the same boat as you, there's a ceiling on your risk (legislation, for example). Those of us who grew up in the era of smoking a pack a day don't really feel that way.
m463
But you have to acknowledge these companies started out as something different.
23andme started out as a democratized sequencing company
google started out as a search company. It became an identification and dossier-building company later.
or maybe I'm naive and they were data-grabs from the start.
loeg
It's a lot less than 90%. Seems like they've exhausted the TAM and there's no one else to sell tests to.
wenc
I made an informed decision when I signed up.
It's SNP genotyping, which, realistically, other than telling you your ancestry and a few health conditions, isn't that predictive of most health conditions. Genotyping only captures a small percentage of total genetic information (it's not a full sequence -- that was still too expensive for what I paid), so the data was actually very limited and the risk was realistically very small.
Privacy is about risk-reward -- rather than applying the precautionary principle to everything (which is overly conservative), we make trade-offs in life.
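To put "small percentage" in rough numbers, here is a quick sketch; the array size is an assumption (consumer genotyping chips have covered roughly 600k to 1M sites over the years).

    # Rough scale comparison: genotyped SNPs vs. the whole genome.
    genotyped_sites = 650_000        # approx. sites on a consumer genotyping array (assumption)
    genome_length = 3_100_000_000    # approx. base pairs in the human genome
    print(f"fraction directly read: {genotyped_sites / genome_length:.4%}")  # about 0.02%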
whyenot
> I never personally understood that or why someone would want to turn over so much data to a commercial entity.
I found my half sister and biological father thanks to 23 and Me. Maybe cases like this will help you understand. Some people are willing to "pay" a lot to find out who they are.
nextos
Also, their genetic risk scores and population admixtures are really bad. I can't understand why they are so bad given that they hired pretty good researchers and building these is quite simple. Freely available models run circles around anything they report on their site.
It's a bit like uBiome, they have sold a lot of snake oil and harmed the reputation of B2C tests. It's a shame as something like 23andme, plus a bit of epigenetic testing to capture environmental factors, could be a wonderful way to get an overall health snapshot.
Glyptodon
Well... My experience (having worked as an SWE) w/ medical technology is that if a company is selling something, they will choose the version of advice or analysis that most aligns with selling something. (I got ordered to adjust scoring thresholds in a statistical grouping to have "nicer" groups, for example.)
If your company does treatment X that competes with treatment Y, they'll look to expand the edge cases that suggest X over Y as much as possible. If a company wants people to feel like they're getting something out of a genetic profile, they'll report the broadest version of risk, and then slow-roll more detailed analysis.
Things like this are why I strongly think certain profit motives and business models should be extremely restricted. Just like private prisons create a profit motive for creating crimes, medical services have a profit motive for spreading inaccurate and twisted medical advice, whether it's things like alternative or new age medicine, treatment modality choices, or DNA information.
WalterBright
Government funded research also has perverse incentives:
1. publish or perish, leading to lots of low quality papers
2. funding doesn't continue if one doesn't get results, leading to selection of "safe" research rather than risky research, and results that cannot be replicated
3. no funding for politically unpopular topics
and, of course, the reasons why people publish overtly fraudulent research papers.
robwwilliams
In the case of 23andMe, users do own their data. And if they download their data and then request that it be deleted, they are the sole owner. But if you're interested in genealogy, kinship, and some of the more actionable SNPs (e.g. those in drug ADME), then the 23andMe interface is informative and even useful medically. I have uncovered two medically useful variants. And some fun ones too: the speed with which I metabolize caffeine.
Their interface is also better than AllofUs.
ZeroTalent
They just did this:
"As an added security measure, we’re requiring that all customers choose a new password unique to 23andMe. To proceed, please reset your password."
I did my test over 10 years ago and lost access to that email.
NICE.
486sx33
Merry Christmas! Your dad isn’t your dad, your biological father is actually mommy’s old “friend” from work - Bill!
kadushka
I don’t get it – why should I care that they have my DNA info?
quickslowdown
You'll get it when your insurance company buys your profile and automatically adds a ton of pre-existing conditions to the list of things they won't cover for you. Or when the government decides to start rounding people up based on ancestry or health conditions.
It'll be way too late by then, but at least you'll get it.
BobaFloutist
For what it's worth, the ACA made it illegal to deny health insurance, or charge more for it, because of preexisting conditions.
I still largely agree that it's worth keeping your DNA out of the hands of data brokers for several reasons, but as laws exist today that would not be legal.
kadushka
I considered this when I sent them my sample. For me the benefit outweighed the risk. I'm sure their DNA database has been sold or given to all kinds of companies and agencies since then (many years ago). Asking them to delete my record now is pointless. Anyone who cared about it already has it.
throw310822
It's funny: people are genuinely upset at others who do send their DNA around, yet assume that their own government might allow such a misuse of personally identifiable information. If your government allows that, the problem is the government, not the people who sent their DNA to 23andMe.
> Or when the government decides to start rounding people up based on ancestry
And you think that not having sent your dna to 23andme would be a good protection from that? Like, since you are among the smart ones who didn't send the dna, you can keep quietly minding your business while the government "rounds up" people on a racial basis?
poulsbohemian
I'm not really disagreeing with you, but it seems to me like the insurance company already has all those blood panels, etc that they paid for - they know what my issues are long before they get around to caring about what 23andMe might tell them. And this government will come after me for being a Democrat and my neighbors for being melanated long before they care about my DNA.
bobxmax
If any of that is happening, society is already a fascist dystopian hellhole and my 23andMe DNA is the least of my problems.
guiambros
With all due respect, that's baseless FUD.
Call me naive, but I believe insurance companies have more concrete battles to fight (e.g. confirmed preexisting conditions), than getting in a legal quagmire with insurees just because "you have 2.7% risk of having early onset of X, instead of the 1.35% of the average population".
I understand the allure of painting everything with dystopian Orwellian colors and "but-what-if-they-do" thing, but the fact that 23andme is going bankrupt and others are not much better (Ancestry was acquired by a PE firm in 2020, and is increasingly pivoting away from DNA) is a great indicator that DNA data is commercially worthless.
If anything, the biggest risk is your data being used for ads. I bet some pharma companies would love to use your DNA data to enhance their 1PD propensity models.
relaxing
This makes a lot of sense if you subscribe to the libertarian view that corporations should have unchecked power and the government is out to get you.
rchaud
I'd imagine you'd care at least a little bit considering you're paying them for the privilege.
noname120
If a company offered to pay you $119 to send them sample cells from your body so that they could sequence your DNA and do whatever they want with it, would you take them up on the offer? I would not.
kadushka
I paid them a lot more than $119 to have my DNA sampled.
brookst
I am also fortunate enough to just not care about $120. That’s not true of everyone though.
duiker101
"We have identified that you are at an increased risk of cancer. To ensure we give you the best care your insurance premium has now gone up 20x, you are welcome."
unyttigfjelltol
The scenario is not occurring, and anyway... if you had information about an increased risk, you can use that information to mitigate the risk and avoid the harm. Even if you think insurance and doctors are good for you, efficient and help you feel better... they are much more effective if you as the patient have a pretty good idea of what might be going wrong.
cluckindan
[flagged]
dragonwriter
> You will understand once DOGE starts ethnic cleansing
The administration's ethnic cleansing policy is already being executed, DOGE so far hasn't been particularly central to that aspect of the Administration's abuses.
Animats
The problem, not stated, is that a bankruptcy can wipe out the obligations of a company to its customers. This includes privacy obligations.[1] Especially if the assets are sold to a company outside California or outside the US.
[1] https://harvardlawreview.org/print/vol-138/data-privacy-in-b...
ajb
Yes. We need obligations to be able to follow personal data, by analogy with real estate: if you agree an obligation with your neighbour (for example, access rights), it can be effected in such a way as to be binding on future owners. Otherwise you could get stuck without access each time they sell up. This is often set up at the point when the land is subdivided.
hypercube33
Personal data should be owned by us, with a license granted to the original company that limits use to its original form, as a read-only entitlement.
That's the only thing I can come up with to stop this and maybe have a side benefit of killing credit companies at 7am before I've had my cup of Joe.
huitzitziltzin
The fact that 23andMe is at risk as a going concern tells you what you need to know about the potential of monetizing large amounts of genetic data. It turns out you can’t get much value from it. If you could, they would have.
And no I don’t think all of that DNA data would be valuable to the likes of a large health insurer like Humana or Aetna either.
The medical records you are imagining an insurer can link to genetic data are worth even less than these DNA sequences turned out to be worth.
Sincerely,
A former health economist who has worked both with tens of millions of inpatient discharge records, and (separately) a detailed survey which is complemented by genetic data.
unyttigfjelltol
Candidly, given existing law in the US, the highest use an insurer could make of the data is to opt families into specialized preventative care using the DNA profiles in the database. They might make pretty decent profits taking that angle, and possibly generate significant goodwill.
s1artibartfast
How would that make them money? Every dollar saved in preventative care is $0.15 less profit, because insurers have a fixed profit margin as a percentage of total care provided, due to the ACA's 85/15 rule.
The only reason to do it would be to compete with a peer insurance company that is already doing it, resulting in less profit for both parties. The optimal strategy from the insurance-profit perspective is to ban any DNA-based cost-saving measures for all insurance companies.
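A rough sketch of that incentive in Python, following the comment's framing of the ACA's medical-loss-ratio rule. Real MLR accounting (rebates, admin costs, the 80% individual-market floor) is more complicated; the comment's $0.15 treats the margin as 15% of care spending, while pinning it to 15% of the premium ceiling gives about $0.18. Either way, the direction of the incentive is the same.

    # Under an MLR floor, admin + profit can be at most (1 - mlr) of premiums,
    # and premiums are, in the long run, pinned near care_costs / mlr.
    def max_retention(care_costs: float, mlr: float = 0.85) -> float:
        premium_ceiling = care_costs / mlr
        return premium_ceiling * (1 - mlr)

    saved = 1.0  # one dollar of care avoided through prevention
    delta = max_retention(1_000_000.0) - max_retention(1_000_000.0 - saved)
    print(f"max retained dollars lost per dollar of care avoided: {delta:.2f}")  # ~0.18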
unyttigfjelltol
Setting aside for a moment whether this is feasible, an effective preventative care program would make the insurance program more competitive, in cost or quality. You're assuming employers and patients are just along for whatever anticompetitive ride insurers take them on. That would be a departure from marketplace fundamentals. The ACA also has safe harbors for process improvements.
On the feasibility point, if this use case is entirely infeasible I don't know what all the hand-wringing on this board is about!
huitzitziltzin
No.
(Slightly longer answer: I don’t have the funds to match what someone is likely to pay for this data, but if I bought it and gave it to you for free, and also gave you all the inpatient hospital discharge abstracts in the US in a matchable form, plus whatever health data you want, plus family relationships, plus a budget for 100 PhDs, product managers, and marketing people for 5 years you couldn’t turn it into a viable product. (Indeed that is literally what 23andme tried to do for years!))
Calvin02
Thank you for sharing.
I have long suspected that the sequencing data isn't valuable except to law enforcement.
If it were as easy to link sequencing to diseases, we would have seen rapid advances in our ability to address those diseases. The genetic data alone isn't enough of a predictor.
steelframe
Whenever I start feeling smug about how cagey I've been about data brokers in the past, I remind myself that enough of my relatives have handed over their DNA to operations like 23andMe so as to render my efforts futile.
thomassmith65
Most likely, they're also sharing photos, videos and intimate details of you on social media.
beng-nl
And have corresponded by email to/from GP using a hosted email provider..
globular-toast
Yeah, and by not participating directly yourself you just earned a tag of "non-conformist", "weirdo" or perhaps "entity that has something to hide".
swyx
respectfully, privacy is important, but what exactly are the attack vectors if, say, I had your DNA? what happens?
1659447091
Health insurance first, is my guess. A way to discriminate; like car companies (GM) sharing data with insurance companies. But on a whole other level.
kristiandupont
Most people here seem concerned about insurance companies misusing personal data or a full-on totalitarian government takeover. However, my concern is about becoming susceptible to manipulation and coercion. A significant aspect of the last election was the use of "Super PACs" like Elon's, which targeted individuals on social media to influence their decisions.
I think this trend will continue, not just in politics but across all sectors. The internet you experience will be tailored completely to your personality, but it will also be shaped to steer you in directions decided by whoever pays the most. The more data they collect about you, the more effective this manipulation will be.
This doesn't even account for the risks posed by malicious actors who might target you using this information.
1659447091
You don't need DNA for that though. Just hand out money and buy people.
Similar to November, a Musk-funded group is currently offering $100 to registered voters in Wisconsin to sign a petition against “activist judges”, with a not-so-subtle nudge to vote for the judge who will be favorable to the lawsuit he (Musk) and Tesla are involved in within the state, and who happens to be part of the Trump gang, having campaigned with junior.
grumple
The administration could decide to detain, deport, or kill everyone who has certain traits - say, Jewish, or Arab, or Mexican, or maybe just has undesirable traits. The db of millions of users makes this very easy.
The US admin is already at the stage of mass deportations. Detention camps (beyond those we already have at the border) are probably not far behind.
chii
None of those actions require the use of dna sequencing to happen.
xp84
If the US is 'at the stage of mass deportation' [of people here illegally in the first place] it's only because we've been wildly generous to that set of people for so long that there's a backlog of them. In most countries, doing crimes while there illegally is something you don't dare do, or deportation could be the least of your worries.
lelandfe
Did Facebook ever create a way for someone to delete the “shadow profiles” it builds for non-users?
I have a suspicion it will entail making an account.
r00fus
I hear the tried and true approach is to make an account and poison their well with fake but somewhat realistic data and don't close it.
dustyharddrive
Why sign their limitation of liability?
carimura
Sure you can delete your data, but guess what, they'll retain it anyways under "regulatory obligations". I've gone back and forth with their privacy team and this is the last response:
"This is a follow-up from the 23andMe Team. To clarify, we and our laboratory vendors are bound by various legal and regulatory obligations that may necessitate retention of certain information. We want to assure you that our data retention program adheres to applicable legal requirements which can vary depending on what country or state a customer lives in, the state a contracted laboratory is located in, and any applicable federal or state licensing obligations related to the ancestry and health products we sell. We can confirm that samples and genetic testing results are deleted in accordance with applicable law and any legal retention obligation serves as a proper exception related to a data deletion request under data privacy laws."
beacon294
It seems like you can sue them. This is purely legal's domain.
arjie
How this does damage in practice isn't clear to me. But I'm going to test this in the very skin-in-the-game sense. My genome (sequenced by Nebula Genomics) is available to anyone who would like it. I have raw FASTQ files, which you will have to pay a nominal fee to access.
Once upon a time, a friend and I decided we should launch a site where people can submit their genomes and health information so that broad population scale studies can be done. I did submit my stuff to All Of Us and so on, but I think the fact that you need to be special-cased to access the data is probably a loss.
So I think it's time to revisit this whole thing. Perhaps I should make VCFs available instead. They're much smaller and may be more accessible for people. In any case, if you want my FASTQs, just email me.
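To give a feel for how much lighter a VCF is to work with than raw FASTQs, here is a minimal sketch that counts the variant records in one, using only the Python standard library; the filename is a placeholder.

    # Count variant records in a plain or bgzipped VCF (bgzip output is gzip-compatible).
    import gzip

    def count_variants(path: str) -> int:
        opener = gzip.open if path.endswith(".gz") else open
        n = 0
        with opener(path, "rt") as fh:
            for line in fh:
                if not line.startswith("#"):  # skip header/meta lines
                    n += 1
        return n

    # print(count_variants("my_genome.vcf.gz"))  # a whole-genome VCF is typically a few million records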
dekhn
Mine are here, free: https://my.pgp-hms.org/profile/hu80855C
(I had this done when I was launching Google Cloud Genomics so I had some data that I could work with without any restrictions. Illumina's genetic counselors told me "you have no genetic risk factors that we can detect", which is more or less what I expected: not that I don't have any, just that Illumina's genetic counselors weren't very good.)
arjie
That's terrific. Here is mine https://my.pgp-hms.org/profile/hu81A8CC (created now in response to your comment)
FASTQs are much larger (being raw reads) so I'll leave them available via personal contact, but this is a good place to host the VCFs. I'll answer the questions etc. as time goes by. Quite a few screenings have revealed a GJB2 variant in my genome, but I don't know if the Nebula sequencing was good enough to detect it.
Also TIL about Google Cloud's stuff for this. Seems like it's been subsumed into a more general SKU for now https://cloud.google.com/life-sciences/docs/process-genomic-...
swyx
excellent, i was going to ask, if you're going to test it anyway, why not make it free. and now you have. nice. i'm really curious what the attack vector is here. i try to take privacy guys seriously but sometimes there's definitely just fearmongering
vl
What service did you use/would recommend for sequencing?
Manfred
A government could decide they don't like a particular phenotype and decide to visit people based on a database. Something similar happened with the harmless "what's your religion" question on census forms in the late 1930s and early 1940s.
BugsJustFindMe
The public already has enough information to substantially harm large groups if they want. A simple example being property ownership databases, which are often publicly available on the internet, can be referenced against culturally-suggestive first and last names to find the domestic whereabouts of large numbers of pick-your-group.
thomassmith65
That's just today. We might live to see targeted diseases.
My fear, in the current era, is to be included in countless virtual 'police lineups'. The higher the availability of my DNA, the higher the chance of a false positive affecting me.
pests
I read a book years ago, whose name I can't remember, that your comment reminded me of, although it achieved this a little differently.
In the book a global pandemic broke out but didn't affect Muslims. How was this possible? An in-world marketing campaign promoted "Mecca Water", through which the antidote/preventative was delivered, and the Muslim world was made to believe it was holy/blessed/a pilgrimage experience to consume it.
I feel it might have been a product of the post-9/11 world, but I did find it interesting, and I also fear the future since it seems possible today; I'm sure some are trying.
vl
I'm interested in sequencing my genome (I don't consider this data private - really any determined entity can collect it with just a bit of effort). How was your experience with Nebula Genomics?
If you would do it today, would your recommend them or somebody else?
robwwilliams
In the George Church crowd. Me too.
A high-resolution image of a face contains as much functionally useful personal data as a VCF, or more.
Hard to be optimistic about US trend lines now, but I trust GINA to remain the law of the land.
https://en.wikipedia.org/wiki/Genetic_Information_Nondiscrim...
And if I am wrong then my DNA security is about the least of my/our problems.
asperous
No singular person; it's more the value of having a large database. You visit a coffee shop, a stalker collects your DNA from a fingerprint and uses a leaked or sold database from 23andMe to tie it to your identity or home address, etc.
Interestingly this also works if a direct relative has used it as well.
lentil_soup
If a stalker already followed you into a coffee shop surely they have your name and address
themagician
The main risk is denial of insurance due to genetics. Insurance company buys database and uses it in the future to deny claims or terminate policies.
robwwilliams
Currently illegal in US.
https://en.wikipedia.org/wiki/Genetic_Information_Nondiscrim...
yborg
Insurance company sees you have a marker for some chronic illness or cancer or whatever and suddenly you can't get life insurance anywhere or have a massive premium. They could even deduce this if only your parents' DNA is available.
Current statute in the US only restricts using this data for health insurance as far as I know; and even if it's straight illegal, the playbook now is just break the rule of law and do whatever you want. I admire your altruism, but our society will not reward you for it.
robwwilliams
And life insurance.
ronnier
> The California-based company has publicly reported that it is in financial distress and stated in securities filings that there is substantial doubt about its ability to continue as a going concern
This is one reason I use signal over other texting apps -- I don't want my private messages sitting in a database waiting to be sold during a fire sale when the company goes under. Also why I try to locally host my apps such as security cameras, password manager, home automation, storage, wiki, among others
Glyptodon
What do you use for home hosting security cams, storage, and PW management? Does your storage solution work for automated phone data backups?
nijave
I use Home Assistant and Frigate for security cams. I have a rack mount server with Ubuntu that acts as a NAS with NFS for ipcam video and SFTP for SwiftBackup from my phone.
I don't host my own password manager but iirc you can self host Bitwarden (I use the hosted version). You can also setup Resilio or Syncthing to sync files from your phone like photos.
ronnier
WireGuard with a domain that only has private IP addresses. Caddy to handle domain certs. I use a split tunnel so my phone is always connected to my local network at home. Everything is HTTPS, even with private IP addresses.
I use frigate and home assistant. I have unraid for storage. I use a small x86 box with openwrt for my router.
I use Vaultwarden (an open-source alternative implementation of the Bitwarden server) for passwords. It’s amazing. And you can use the native Bitwarden client.
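A small related sketch: if a domain should only ever resolve to private address space, a check like this (Python standard library only; the hostname is a placeholder) confirms that a name resolves exclusively to private RFC 1918 / ULA addresses before you trust it on the LAN.

    import ipaddress
    import socket

    def resolves_private_only(hostname: str) -> bool:
        # Resolve the name and verify every returned address is private.
        infos = socket.getaddrinfo(hostname, None)
        addrs = {ipaddress.ip_address(info[4][0].split("%")[0]) for info in infos}
        return bool(addrs) and all(a.is_private for a in addrs)

    # print(resolves_private_only("ha.home.example.com"))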
Guvante
If 23andMe has an agreement with its consumers on how it will handle their data, it should not matter whether the company is bought; that agreement should be maintained in perpetuity unless those consumers actively choose to change it.
After all, we wouldn't talk about Dropbox being sold resulting in the ransacking of your personal data, so why is that part of the conversation with 23andMe?
(I am not being critical of the AG here but instead pointing out how lax consumer protections have gotten that we even need to have this be a talking point)
JonathonW
> After all, we wouldn't talk about Dropbox being sold resulting in the ransacking of your personal data, so why is that part of the conversation with 23andMe?
Both 23andme and Dropbox's privacy policies only require them to notify users if the privacy policy changes (no restriction on scope of those changes), so maybe we should (if Dropbox were to be sold)?
Guvante
Not legally, they can only do that if you implicitly agree by continuing to use the product.
If you don't interact in a meaningful way you cannot change a contract from one side you need a new agreement.
Now whether this is enforced is a different matter.
karaterobot
You're right that it should not matter. That would be a great world to live in! It's not this one, though. Companies ignore these agreements all the time. Sometimes they're even caught and their wrists get slapped.
More often (I believe) we just never learn the agreements have been broken in the first place.
But it is a rule—almost approaching a law of nature—that companies facing financial distress will begin putting a price tag on private data they've promised never to sell. It's like the cartoon with the starving people in the life raft: they look at your data, and suddenly they don't see a legal agreement to protect it, they see a juicy drumstick.
> After all, we wouldn't talk about Dropbox being sold resulting in the ransacking of your personal data, so why is that part of the conversation with 23andMe?
Well, opinions differ on that one too!
No1
I have been wanting to get my genome sequenced for years, and had been thinking 23andme might be one of the better options because of the possibility of invoking the CCPA to get my data deleted after sequencing. Never did it because I wonder if they sell your info to some third party the second it comes off the sequencer, and also because I'm skeptical that they would fully comply with a deletion request.
For people who would like to get their DNA sequenced but are actually concerned about privacy, are there any better options?
EGreg
I guess it's just my programming instincts, but I just immediately think of the possible worst case scenarios and how strong are the guarantees they're prevented.
Dividing by X? I immediately think what if X is zero. Dereferencing X.Y ? I think what if X is null / nil. And so on.
So when it came to the DNA, I was hesitant to do it, since your DNA can wind up in all kinds of databases. And it turns out I was probably right.
What you could have done is sent in the information anonymously, or under a fictitious name. You can still use an email address and log in and see the results. Or you could use someone else's name from another country (with their permission), but then if that person ever gets in trouble, the DNA evidence might somehow implicate you (such as the guy with the last name NULL who got a lot of parking tickets LOL). A couple months ago I actually did submit with heritage.com and 23andme for a friend, so I think there was no place where you had to provide ID or something.
ekianjo
Until you can do it with a kit at home, probably none
jrm4
A simple rule.
When a company promises to never do a thing (e.g. be careless with or sell off important data like this), but there is no legal consequence or assurance, that company -- or some different company related to it -- is definitely, absolutely, going to do that thing.
teeray
This sucks the most for everyone that never consented to genetic data collection, but they have it all anyway. If you were the only holdout in your family to not use 23andme, it doesn’t really matter since they know a lot about you anyway. Genetic information is fundamentally shared among a group, so you shouldn’t really be able to consent to disclose it in a way that allows a company to do whatever it wants with it. They haven’t obtained all of the consent.
robwwilliams
I do not understand the purpose of this alert. There are no explicit warnings, just a premonition. The alert merely says what all users should already know: that their genetic and survey data can be deleted if they request it to be deleted.
That obligation to delete user data is persistent and will apply to any buyer of 23andMe. Or am I wrong?
Is the AG of California intimating that the data is now at risk of being released into the wild, or worse? That is how some will respond to this alert.
What many customers may not know is that they can also download these valuable genotype data and store them locally if they wish. Using these data is not easy, but it is possible with a bit of research and help.
Those who have used 23andMe should and can expect the security of their data to be maintained by the company, and that obligation would apply to any purchaser.
I work in population genomics (non-human organisms), and myself participated in an early near-whole genome genotyping study back when microarrays were still the predominant technology (academic NOT commercial).
But for nearly 20 years I've been telling my extended family NOT to participate in any large scale genotyping with 23 and Me or similar commercial companies where they retain rights to your data, anticipating that something like the current scenario would likely play out.
Somehow, 23 and Me genotyping became the "gift du jour" for Xmas some years back -- I never personally understood that or why someone would want to turn over so much data to a commercial entity.
This is not to say that large scale sequence information is not appropriate for *some people*. But if that's something you need, make every effort to make sure you own your own data.