Project Zero – Policy and Disclosure: 2025 Edition
32 comments · July 29, 2025 · woodruffw
structural
There really isn't a great solution here. The notice that a vulnerability has been discovered puts even more pressure on the fix to be deployed as close to instantly as possible, throughout the entire supply chain.
Why is this? Especially for smaller or more stable open-source projects, the number of commits in a 90-day period that could plausibly be security-relevant is likely to be quite low, perhaps as low as single digits. So the specific commit that fixes the reported security issue is highly likely to be identified immediately, and now there's a race to develop and use an exploit.
As one example, a stable project that's been the target of significant security hardening and analysis is the libpng decoder. Over the past 3 months (May 1 - Jul 29), its main branch has seen 41 total commits. Of those, at least 25 were non-code changes, involving documentation updates, release engineering activities, and build system / cross-platform support. If Project Zero had announced a vulnerability in this project on May 1 with a disclosure embargo of today, there would be at most 16 commits to inspect over 3 months to find the bug. That's not a lot of work for a dedicated team.
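To make the attacker's side of that race concrete, here is a hypothetical sketch of the triage: discard commits that only touch docs, CI, or build files, and keep the ones that touch source code. All commit messages and file paths below are invented for illustration.

```python
# Hypothetical triage of a commit window: a commit can only hide a
# security fix if it touches actual source files, so filter out
# docs / CI / build-system changes. All data below is made up.
CODE_SUFFIXES = (".c", ".h", ".cpp")

def worth_inspecting(changed_files):
    """Rough heuristic: keep only commits that touch source code."""
    return any(path.endswith(CODE_SUFFIXES) for path in changed_files)

commits = [
    ("update README for release", ["README.md"]),
    ("fix chunk length validation", ["pngrutil.c"]),
    ("bump CI runner image", [".github/workflows/ci.yml"]),
    ("regenerate configure", ["configure", "Makefile.in"]),
]

suspects = [msg for msg, files in commits if worth_inspecting(files)]
print(suspects)  # only the source-touching commit survives the filter
```

Even a crude filter like this cuts the haystack down fast, which is the point of the comment above: 90 days of a stable project's history is a small search space.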
So now, do we delay publishing security fixes to public repos and try to maintain private infrastructure and testing for all of this? And then try to get a release made, propagated through multiple layers of downstream vendors, have them make releases, etc... all within a day or two? That's pretty hard, just organizationally. No great answers here.
Shank
On the contrary: if Project Zero finds a 0-day in a product I know I use, and I know that product is internet-facing, I can immediately take action and firewall it off. It isn't always the case that they find things like this, but an early warning signal can be really beneficial.
For customers, it also gives them leverage to contact vendors and ask politely for news on the patch.
woodruffw
Maybe I don't understand the threat model here: what kind of public-facing services are you running that are simultaneously (1) not already access-limited, and (2) not load-bearing such that they need to be public-facing?
(And to be clear: I see the benefit here. But I'm talking principally about open source projects, not the vendors you're presumably paying.)
richardwhiuk
Some companies might be willing to compromise functionality to avoid compromise of their networks.
There's always a usability/functionality vs. security tradeoff.
saagarjha
Unfortunately I think most of the products you use have 0-days in them, it's just that Project Zero hasn't found them yet.
tptacek
I love it; it's a big-company reformulation of the classic vulnerability researcher's "reporting transparency" process: post "Found a nasty vuln in XYZ: 6f0c848159d46104fba17e02906f52aef460ee17d1962f5ea05d2478600fce8a" (the SHA2 hash of a report artifact confirming the vuln).
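That "reporting transparency" trick is a simple hash commitment: post the digest of the private write-up now, reveal the report after the embargo, and anyone can verify the two match. A minimal sketch (the report text here is invented):

```python
import hashlib

def commit_to_report(report: str) -> str:
    """Return the SHA-256 digest of a private vulnerability write-up.
    Posting this digest publicly proves the report existed at that
    time, without revealing any details of the vulnerability."""
    return hashlib.sha256(report.encode("utf-8")).hexdigest()

# Invented example; in practice this would be the full advisory text.
report = "XYZ 1.2.3: heap overflow in parse_header() via oversized length"
digest = commit_to_report(report)
print(f"Found a nasty vuln in XYZ: {digest}")

# Later, once the report is published, anyone can re-hash it and
# confirm it matches the digest posted earlier.
assert commit_to_report(report) == digest
```

One caveat: if the report's contents are guessable, the commitment can be brute-forced, so in practice the write-up should include a random nonce before hashing.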
jms703
This seems like a good move. I do hope that slow-moving consumers of the software in question can start anticipating an upcoming release and construct remediation plans, instead of doing that after the release.
esnard
Related links:
- Vulnerability Disclosure FAQ ( https://googleprojectzero.blogspot.com/p/vulnerability-discl... )
- Reporting Transparency ( https://googleprojectzero.blogspot.com/p/reporting-transpare... )
croemer
> This data will make it easier for researchers and the public to track how long it takes for a fix to travel from the initial report, all the way to a user's device (which is especially important if the fix never arrives!)
This paragraph is very confusing: What data is meant by "this data"? If they mean the announcement of "there's something", isn't the timeline of disclosure made public already under current reporting policy once everything has been opened up?
In other words, the date of the initial report is not new data? Sure, the delay is reduced, but it's not new at all, in contrast to what the paragraph suggests.
eyalitki
Not sure what the measurable metric is here, or what will be considered a success in this trial period.
Propagating the fix downstream depends on the release cycles of all downstream vendors. Giving them a heads-up will help planning, but I doubt it will significantly impact the patching timeline.
It is far more likely that companies will get stressed that the public knows they have a vulnerability while they are still working to fix it. The pressure from these companies will probably shut this policy change down.
Also, will this policy apply to Google's own products?
zamadatix
The measure would probably be whether any of the reports lead downstreams to sync fixes prior to release via security-sharing arrangements they didn't already have established, or lead projects to prepare an out-of-schedule sync ahead of time, regardless of whether the magnitude of change is small or large. How companies would prefer the public to hear about a vulnerability has always been the lowest concern in disclosure, so I don't expect this to bring anything new here.
Google's own products account for 3 of the 6 initial vulnerabilities listed on the linked reporting page under this new policy.
runningmike
It is indeed a complex problem. But is Google now killing FOSS slowly? IMHO there is far too much emphasis on FOSS security and far too little on closed-source hardware, firmware, and software. Too much blame and pressure will not solve the complex problems described in the blog.
some_furry
Shoring up the security of FOSS is not "killing FOSS slowly".
Closed source software doesn't get to benefit from the goodwill of the open source software community, which includes independent security researchers as well as orgs like P0.
I guess our disagreement can be distilled down to one question:
Why would an emphasis on closed source products help FOSS, and why would an emphasis on FOSS help closed source?
Because this seems backwards to me. Maybe it makes sense in public relations where vibes are more important than substance and nobody thinks for more than 100 milliseconds?
mananaysiempre
It depends on the maintainer; some of them have indeed found themselves unwilling to continue their work, in part because of Project Zero.
> I just stepped down as libxslt maintainer and it's unlikely that this project will ever be maintained again. It's even more unlikely with Google Project Zero, the best white-hat security researchers money can buy, breathing down the necks of volunteers.
tptacek
I know it's hard to believe this given the circumstances --- that maintainer has a very good reason for stepping back, absolutely no shade intended there --- but GPZ is doing a service for these projects. The vulnerabilities they find are there whether or not Google or anybody else steps up on the implementation side. They are simple facts of the software, and it's difficult, expensive, and important to uncover those facts.
riedel
It also seems to disclose interesting internal products: what is Google Bigwave?
zamadatix
Seems to be described a bit here https://www.androidauthority.com/how-google-built-tensor-g5-...
bgwalter
I find the stated goal of alerting downstream a bit odd. Most downstreams scan upstream web pages for releases and automatically open an issue after a new release.
Project Zero could also open a mailing list for trusted downstreams and publish newly found announcements there.
The real goal seems to be to increase pressure on upstream, which these days ranks lowest on the open-source ladder: below distributors, corporations, security pundits (some of whom do not write software themselves and have never been upstream for anything), and demanding users.
amiga386
If Google is adopting this, maybe rachelbythebay's vagueposting was ahead of the curve?
I jest; the vagueposting led to uninformed speculation, panic, reddit levels of baseless accusation, and harassment of the developers: https://news.ycombinator.com/item?id=43477057
I hope Google's experiment doesn't turn out the same.
fn-mote
> I jest; the vagueposting led to [...]
Resurrecting a 4-month-old issue that evaporated in a day or two seems like poor form to me.
Also, I believe most of the responsibility for the negative behavior should be assigned to those actually engaging in it, not to the initial post. I understand others reasonably disagree (notably about the accusations and harassment).
Tbh, it sounds like you might have been personally affected? At any rate, I certainly don't condone a mob mentality.
amiga386
I stand by what I said at the time: https://news.ycombinator.com/item?id=43492940 - and if you only read one thing, read the harassment an atop contributor was subjected to by "eslerm": https://github.com/Atoptool/atop/issues/330#issuecomment-275...
I bring it up because of the unmissable parallels. Google are trialling a policy to see what will happen, but this incident shows already what can happen.
RbtB is a trusted blog by the HN crowd, and her vaguepost unexpectedly whipped up hysteria. It was only quelled by a post with more details the next day. Google Project Zero has enormous levels of trust, intends to vaguepost as policy, and not post more details the next day to satisfy the mob.
It does not look good for volunteer maintainers to suffer an entire world of talentless clowns rifling through every commit and asking "is this the bug Project Zero found?"
tptacek
The "Rachel By The Bay" blog and Google Project Zero are not reasonable comparands in matters of vulnerability disclosure.
diggan
> uninformed speculation, panic, reddit levels of baseless accusation, and harassment of the developers
To be fair, it seems like the only way of avoiding something like that is never saying anything publicly. The crowds of the internet eagerly jump into any drama, vague or not, and balloon it regardless.
eddythompson80
I remember when heartbleed was the big thing. There were many people digging into the person who committed the bug. Looking for where they lived, worked, and traveled. Many people were so desperate to find something to prove he was a spy.
If you google his name, 80% of the results are articles about him denying that he introduced it on purpose.
perching_aix
Speaking of, whatever came out of that? I don't see any related updates on that blog.
diggan
This was published the day after, with the title "Problems with the heap" but the URL makes the context clear: https://rachelbythebay.com/w/2025/03/26/atop/
perching_aix
Yeah, I meant since that one.
null
This policy change makes sense to me; I'm also sympathetic to the P0 team's struggle in getting vendors to take patching seriously.
At the same time, I think publicly sharing that some vulnerability was discovered can be valuable information to attackers, particularly in the context of disclosure on open source projects: it's been my experience that maintaining a completely hermetic embargo on an OSS component is extremely difficult, both because of the number of people involved and because fixing the vulnerability sometimes requires advance changes to other public components.
I'm not sure there's a great solution to this.