
Against Transparency

15 comments · April 19, 2025

norseboar

I think the argument is interesting, but the specific example of prop 65 doesn't really work on a few levels. The argument in the post is that Prop 65's warnings are legitimate in some sense, but only apply in specific contexts.

However, Prop 65 is much broader than that. To qualify, a chemical just needs to show up on one of maybe half a dozen lists that associate the chemical with cancer, but all those lists show is that in some study, at some quantity, the association existed. The amount linked to cancer could be far beyond what is ever present in a consumer good, and the link may only have been shown in non-human animals.

The lists aren't the ones gov't agencies like the FDA use to regulate product safety; they're lists far upstream of that, which research institutions use to inform further study. The typical starting point is a mouse study with a huge dosage. It's not a useless study, but it's not meant to inform what a human should or should not consume; it's just the start of an investigation.

I don't think this actually has any bearing on the substance of the broader argument, but Prop 65 is not the best example.

1oooqooq

Prop 65 has faced the same level of coordinated opposition and information corruption as the food pyramid or the harms of cigarettes did for most of their history.

Industry colluded to make it seem useless, and industry spoon-fed you the narrative you repeated. The list is very informative and meant to force the "invisible hand of the market" (it's a pun, relax) to pay for better studies if they truly believe something is not harmful but the studies are inconclusive. Industry simply decided to band together and spend on making the signs useless.

norseboar

> The list is very informative and meant to force the "invisible hand of the market" (it's a pun, relax) to pay for better studies if they truly believe something is not harmful but the studies are inconclusive

To make sure I understand right: you're saying a good way to run things is to publish a list of claims that could be true or false, and then, if industry cares enough, let it spend time and money debunking them?

I think that would be an extremely slow, conservative way to run just about anything, and it is not the way we handle basically any claim. I can see an argument for "don't do something until you prove it's safe" being useful in some very high-risk situations, but "warn that all kinds of commonplace things could cause cancer until somebody proves they don't" is misleading, not just conservative.

And it doesn't even work -- lots of places have spent time and money debunking, e.g., negative claims about aspartame, but claims about how unsafe it is persist. And it all comes back to dosage. There is no good evidence that aspartame, at the levels found in a normal soda, causes any issues for humans, but this gets drowned out by studies showing either effects from massive doses in rodents or indirect effects (e.g. it makes you hungrier, so if you eat more refined sugar as a result of that hunger, then yes, it's bad for you, just like more refined sugar is almost always bad for you).

1oooqooq

You are still misguided into thinking the list is utterly useless. I cannot open your eyes for you.

Go for first-hand experience. You are still repeating people you don't know (and have been told are authorities).

9283409232

I don't know if I would even call this clickbait, but this is not an argument against transparency. It's an argument against poor regulations. I would argue Prop 65 is the opposite of transparency, because just about everything causes cancer, so people have learned to ignore the warning. It was a law passed at a time when we didn't have as much information as we do now, and it should be updated and made more specific.

> You know what would be better than a privacy policy? A privacy law.

I agree but I wouldn't call privacy policies transparent. They are made of vague legal speak like "we may or may not share your information with advertisers and partners." There are good arguments in here but they are framed against the wrong target.

idle_zealot

The framing being used is that what we currently do is "pro-transparency." We make laws to "inform" consumers and then trust that the market will sort the rest out. Cory rejects this as a workable tactic, because transparency, even real, full transparency, just becomes noise that people filter out when making decisions. He argues that if you want good outcomes you need legislation other than forcing transparency.

yxhuvud

The flip side of that argument is that one way to get good legislation is to have some level of transparency in place, so that people can form informed opinions about what is good.

9283409232

I don't think I disagree with the conclusion, but my point is that we don't have real transparency, and a lot of these transparency laws actually obscure information to confuse the consumer. So I guess the issue I'm raising here is that the laws he is attacking aren't real transparency.

jfengel

"Privacy policy: we don't collect or retain any data at all ever period."

You don't keep server logs? Cool and all, but it sounds like you'll have a hard time debugging if something ever goes wonky.

cardanome

If your server logs contain personal information then you are doing something horribly wrong and I hope you don't operate in the EU.

Don't log sensitive data. You don't need that for debugging.
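
A minimal sketch of that idea, assuming a Python service using the standard logging module; the regexes, the "app" logger name, and the redaction placeholders are illustrative, not an exhaustive personal-data scrub:

    import logging
    import re

    # Illustrative patterns only; a real deployment needs a broader scrub.
    IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
    EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")

    class RedactPersonalData(logging.Filter):
        """Mask IP addresses and email addresses before a record is written."""
        def filter(self, record: logging.LogRecord) -> bool:
            msg = record.getMessage()            # render msg % args first
            msg = IPV4.sub("[ip-redacted]", msg)
            msg = EMAIL.sub("[email-redacted]", msg)
            record.msg, record.args = msg, ()    # store the scrubbed message
            return True                          # keep the record, just cleaned

    handler = logging.StreamHandler()
    handler.addFilter(RedactPersonalData())
    logger = logging.getLogger("app")
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)

    logger.info("login failed for alice@example.com from 203.0.113.7")
    # -> login failed for [email-redacted] from [ip-redacted]

The point is that the scrubbing happens before anything touches disk, so the logs stay useful for debugging errors and timing without retaining personal data.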

lq9AJ8yrfs

But this is the same problem!

The GDPR and such define PII so broadly that more or less everything in web server logs is included in the definition.

Not sensitive PII, but still PII that the individual has rights and interests over.

That is more or less on purpose, and they do have a point.

Rogue debugging, on the other hand, is not what they are worried about; the concern is using the data in web logs for targeting, profiling, etc.

If you could sell your web logs, would you? How much would someone pay Reddit or GitHub for theirs? And would you be OK with that if your browsing history was in there?

robin_reala

To be clear, the GDPR never uses the term Personally Identifiable Information. It uses PD, or Personal Data: this can be identifying on its own, but it's more likely that some aggregate of multiple pieces of PD becomes identifying only when taken together.

ikiris

The problem is that this probably translates to the following: "Privacy policy: we're just gonna lie about it because our lawyers don't think there are consequences."

lgas

Or "we're just gonna lie about it because we don't think there's consequences so we didn't even ask our lawyers".

teddyh

No mention of the GDPR.