The Privacy Theater of Hashed PII
19 comments · October 20, 2025 · blitzar
ozim
Yeah, if you want to check whether a user is in someone else's database, you ask the user if the check can be performed; then you already have the check half done. If the user doesn't agree, then even if he is in the other database, it is not for you to make that check.
panstromek
Yeah, this is pretty annoying, and it's not the only problem in this field. There's a bunch of theater, or misunderstanding, in the marketing space. I feel like marketing people just don't get it. They seem hopelessly incapable of accepting that matching people in whatever way possible is the exact practice that laws like the GDPR are trying to target. You cannot get around it by hashing, fingerprinting, ad IDs, cookieless matching, or whatever.
panstromek
I also think that many vendors in this space are abusing the fact that marketers are not technical people, so they just wave around slogans like "we're GDPR ready" and "anonymized data" so that marketers feel they can tick the "GDPR" box and keep all the metrics they are used to.
What marketers don't realise, of course, is that GDPR compliance is partially on them, and that some of those metrics are literally impossible to implement without crossing into GDPR territory. Any company claiming to be "fully GDPR compliant" while also giving you retention and attribution metrics by default is probably confusing you in this way.
iamacyborg
They’re heavily incentivised not to get it, both internally through company KPIs that haven’t kept pace with the reality of the GDPR, and externally through ad platforms that continue to demand excessive amounts of data without providing suitable alternatives.
panstromek
Yeah, companies are probably abusing this (as I noted in the sibling comment), but I think marketers themselves truly don't get it. I've been on the implementation side of this, and it's always a frustrating debate. It's pretty clear that they think this is just about picking a different vendor with "GDPR" on its list of features, not realizing that the law fundamentally targets the metrics they want to use, and that they simply cannot do it "the old way" they are used to.
rdtsc
It is mostly performative. They do it so nobody can point fingers and accuse them of not doing it.
nevon
The company I work for has a similar, yet even worse, instance of this. The employee satisfaction survey was advertised as anonymous, but when I looked into the implementation, they were just hashing the email address, of which there were only a few thousand. A more conspiratorial mind would conclude that this is to make it easy to find out who a particular piece of feedback came from, but in this case I legitimately think it's just incompetence, and not being able to figure out a better way of ensuring each employee can only submit the survey once.
This year it's advertised as confidential, rather than anonymous, so I suppose that is an improvement.
rented_mule
Not calling it anonymous is an improvement. Before I retired, I read many "anonymous" surveys taken by my reports. Any free-form text in the survey that goes beyond a sentence fragment usually made it obvious who wrote it. At least in the case of my teams, writing styles tended to be pretty distinct, as were the things each person cared about enough to write at any length. I tried to ignore the clues, but it was usually so obvious that it jumped out at me. The people administering such things insisted that anonymous meant their name wasn't on it, so it was fair to call it that.
chii
A lot of people simply imagine that anonymity means being unidentifiable. That's far from true, but I think some are honestly making the mistake rather than being nefarious.
ozim
For me, it seems like cracking hashes is irrelevant in the grand scheme of things.
All of these laws were passed so that companies don't compare their customer lists without asking the customer first.
I hope some government agency picks that up and strikes such BS down with might.
If you are a BambooHR customer with people in your HR system, you have to ask each person whether you can check if they are in BambooHR. Guess what: whether they say no or yes, you already have half of the job done.
Hashing the data and checking whether it's in your database is still sharing, and it still requires consent. Fuckers.
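The list-comparison practice being objected to here can be sketched in a few lines (the addresses and the `naive_hash` helper are hypothetical, for illustration only):

```python
import hashlib

def naive_hash(email: str) -> str:
    # Unsalted, normalized hash, as typically used in "privacy-safe" list matching.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

ours = {"alice@example.com", "bob@example.com"}
theirs = {"bob@example.com", "carol@example.com"}

# Each party hashes its list; intersecting the hash sets matches the people.
shared = {naive_hash(e) for e in ours} & {naive_hash(e) for e in theirs}
assert len(shared) == 1  # one customer identified on both sides
```

Because both sides use the same deterministic hash, the intersection identifies real individuals just as surely as comparing plaintext lists would, which is why hashing alone doesn't remove the consent requirement.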
FooBarBizBazz
Isn't this solved with salt?
bob1029
This is how I did it. You generate a salt per logging context and combine it with the base value into a SHA-2 hash. The idea is that you ruin the ability to correlate PII across multiple instances in different isolated activities. For example, if John Doe opened a new account and then added a co-owner after the fact, it wouldn't be possible for my team to determine from our logs that it was the same person.
This isn't perfect, but there hasn't been a single customer (bank) that pushed back against it yet.
Salting does mostly solve the problem from an information theory standpoint. Correlation analysis is a borderline paranoia thing if you are practicing reasonable hygiene elsewhere.
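A minimal sketch of the per-context salting described above (the names, salts, and email are illustrative assumptions, not the actual implementation):

```python
import hashlib
import secrets

def hash_pii(value: str, context_salt: bytes) -> str:
    # One salt per logging context: the same value hashes differently
    # in different contexts, so cross-context correlation is ruined.
    return hashlib.sha256(context_salt + value.encode()).hexdigest()

# Hypothetical: two isolated activities, each with its own random salt.
salt_open_account = secrets.token_bytes(16)
salt_add_coowner = secrets.token_bytes(16)

h1 = hash_pii("john.doe@example.com", salt_open_account)
h2 = hash_pii("john.doe@example.com", salt_add_coowner)
assert h1 != h2  # same person, but the logs can't link the two events
```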
hlieberman
If it's salted, you can't share it with a third-party and determine who your customers in common are. (That's the point of the salt; to mean that my_hash(X) != your_hash(X)).
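That property is easy to demonstrate (a toy sketch; a real deployment would use random, secret salts rather than these placeholder strings):

```python
import hashlib

def salted_hash(value: str, salt: bytes) -> str:
    return hashlib.sha256(salt + value.encode()).hexdigest()

my_salt = b"party-A-secret"
your_salt = b"party-B-secret"

# With independent salts, the two parties' hashes of the same customer
# never match, so their hashed lists can't be joined.
assert salted_hash("alice@example.com", my_salt) != \
       salted_hash("alice@example.com", your_salt)
```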
jstanley
> A 2020 MacBook Air can hash every North American phone number in four hours
If you added a salt, this would still allow you to reverse some particular hashed phone number in about four hours; it just wouldn't allow you to do all of them in one pass.
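A toy version of that attack, assuming the salt is stored alongside the hash (as it must be for the hash to remain checkable). The search range is shrunk here so the example runs instantly; a real attacker would iterate every number:

```python
import hashlib

def hash_phone(phone: str, salt: bytes) -> str:
    return hashlib.sha256(salt + phone.encode()).hexdigest()

salt = b"per-record-salt"                # known to the attacker
target = hash_phone("5551234567", salt)  # the "anonymized" record

def crack(target_hash, salt):
    # Tiny range for illustration; exhausting all ~10^10 North American
    # numbers is the hours-on-a-laptop figure quoted above.
    for n in range(5551230000, 5551240000):
        candidate = str(n)
        if hash_phone(candidate, salt) == target_hash:
            return candidate
    return None

assert crack(target, salt) == "5551234567"
```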
chrisandchris
A salt helps a lot when the input is unpredictable. If the input stays within a small, enumerable range (e.g. phone numbers), a salt does not help much, because each hash can still be brute-forced individually.
This is not for privacy. It is done for the sellers and buyers of PII: buyers do not want to pay for data they already own, and sellers do not want to disclose the data before they sell it.
There is no honour amongst data thieves.