Looking at some claims that quantum computers won't work
January 18, 2025
woodruffw
> Cloudflare, which hosts a considerable fraction of the Internet's web sites, reports that 33% of its connections are using post-quantum crypto as of January 2025.
DJB's narrative is a little selective here: Cloudflare has done some incredibly impressive things with post-quantum key agreement, which is arguably the "easy"[1] part of moving the Web PKI/TLS to a PQ setting. But key agreement doesn't tell the parties why they should trust each other; you need signatures and certificates for that, and those will need to be PQ-ready too.
That part is much harder, for both technical reasons (the larger certificates implied by most PQ signing schemes are much harder to convey reliably over packet networks; see the size sketch below) and political ones (the X.509 ecosystem moves very slowly, and adoption of new signature schemes takes years).
[1]: Nothing about it is easy.
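To make the size asymmetry concrete, here is a back-of-the-envelope comparison in Python. This is a sketch, not a spec: the byte counts are the ML-KEM-768 and ML-DSA-65 sizes from FIPS 203/204 plus typical X25519/Ed25519 encodings, and the three-signatures-plus-three-keys chain model is an assumption.

    # Why PQ key agreement shipped before PQ certificates, in rough numbers.
    # Sizes in bytes: ML-KEM-768 / ML-DSA-65 from FIPS 203 / FIPS 204,
    # classical values are X25519 / Ed25519. Treat everything as approximate.

    classical = {"public key": 32, "signature": 64}       # X25519 / Ed25519
    pq_kem    = {"public key": 1184, "ciphertext": 1088}  # ML-KEM-768
    pq_sig    = {"public key": 1952, "signature": 3309}   # ML-DSA-65

    # Key agreement: one key share from the client, one reply from the server.
    print("classical key agreement:", 2 * classical["public key"], "bytes")
    print("ML-KEM-768 key agreement:",
          pq_kem["public key"] + pq_kem["ciphertext"], "bytes")

    # Certificates: a typical chain carries ~3 signatures (two CA signatures
    # plus the handshake signature) and ~3 public keys.
    def chain_bytes(alg):
        return 3 * alg["signature"] + 3 * alg["public key"]

    print("classical chain: ~", chain_bytes(classical), "bytes of crypto material")
    print("ML-DSA-65 chain: ~", chain_bytes(pq_sig), "bytes of crypto material")

On those (assumed) numbers, PQ key agreement costs a couple of kilobytes per handshake, while a fully PQ chain carries roughly 15 KB of signatures and keys, enough to spill past a typical initial TCP congestion window and cost an extra round trip. That is the "harder to convey over packet networks" problem in miniature.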
Azerty9999
Are you indicating that Cloudflare's implementation isn't truly fully post-quantum secure because of lagging certificate standards/technology?
Wanted to provide the source for your point about 33% of Cloudflare TLS traffic using post-quantum encryption as of Jan 2025 [1]
[1] https://radar.cloudflare.com/adoption-and-usage#post-quantum...
woodruffw
> Are you indicating that Cloudflare's implementation isn't truly fully post-quantum secure because of lagging certificate standards/technology?
Yes, although I wouldn't say "truly" because they haven't intimated that it is. I'm not claiming any malfeasance on Cloudflare's part: they have been very explicit about the fact that the PQ components deployed so far are only in the key exchange. Bas Westerbaan has a great post on the Cloudflare blog about the state of PQ in 2024.
AlexCoventry
In the Scott Aaronson talk he links at the bottom[1], Prof. Aaronson says
> A fourth reason why people didn’t take QC seriously is that, a century after the discovery of QM, some people still harbor doubts about quantum mechanics itself... Or they say things like, “complex Hilbert space in 2^n dimensions is a nice mathematical formalism, but mathematical formalism is not reality”—the kind of thing you say when you want to doubt, but not take full intellectual responsibility for your doubts.
How is that a failure to take intellectual responsibility? (Asking because it's basically what I think[2], but I promise not to argue with any explanation given here. :-)
cvoss
One should investigate closely the connection between (well-supported) theories of physics and "reality", which, I gather, in this case means something like "the ontology of the universe".
Is the universe "actually made of" a flexible substance that moves according to the equations of GR? In one sense, it doesn't matter. It acts like it does, and so well that we can make very precise predictions about what will happen.
Suppose a naysayer in 2010 said "Well, GR is a nice mathematical formalism, but mathematical formalism isn't reality. It's preposterous to think space is actually made of this mysterious flexible substance. You'll never see gravitational waves. It's a fiction of the math."
The naysayer has conflated the claim "GR is ontologically true" with the claim "GR makes accurate predictions." The first is irrelevant, and may be freely denied without casting meaningful doubt on the second, which has been well-tested for a century. It would be a great surprise to conduct an experiment and learn that GR mispredicted the result.
QM predicts QC. To doubt QC is to doubt that QM accurately predicts experiments we can conduct. In this case, once again, a century of experiments cuts the other direction. The failure of QC would be the surprise, not its success.
AlexCoventry
So the GR guy would have been wrong, but how would he be failing to take intellectual responsibility?
ano-ther
I am not Scott Aaronson, but probably because that statement makes no effort to analyze how the Hilbert-space formalism is connected to reality, or how, where, and why that connection might break down.
jiggawatts
> complex Hilbert space in 2^n dimensions
A very simple argument is that there are strong reasons to believe that energy is required to represent all information in the physical universe. You can't have "states" without mass/energy storing that state somewhere.
2^n is clearly super-linear in n, so as you scale to many particles, the equations suggest you'd need a ludicrously huge state space, which requires a matching amount of energy to store. Clearly, this is not what happens: increasing the mass/energy of a system 10x doesn't result in 2^10 = 1024x as much mass/energy. You get 10x, plus or minus a correction for binding energy, GR, or whatever.
Quantum Computing is firmly based on pretending that this isn't how it is, that somehow you can squeeze 2^n bits of information out of a system with 'n' parts to it.
The ever increasing difficulties with noise, etc... indicate that no, there's no free lunch here, no matter how long we stand in the queue with an empty tray.
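As a concrete illustration of the 2^n bookkeeping being described (whether nature actually has to pay mass/energy to "store" that state is exactly the point in dispute), here is what it costs a classical machine to hold a full n-qubit state vector:

    # Memory to store the full state vector of n qubits classically:
    # 2**n complex amplitudes at 16 bytes each (complex128).

    def statevector_bytes(n_qubits):
        return (2 ** n_qubits) * 16

    for n in (10, 20, 30, 40, 50):
        print(n, "qubits ->", statevector_bytes(n), "bytes")
    # 50 qubits is already ~1.8e16 bytes (~16 PiB); each added qubit doubles it.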
ordu
> How is that a failure to take intellectual responsibility?
The argument is metaphysical; it doesn't pose any concrete question about the validity of QM, such as "does QM predict reality correctly?". All of physics is built on the nice mathematical formalism of real numbers, and that doesn't make physics invalid: it can be used to build planes and computers, despite those pesky real numbers being "unreal".
Intellectual responsibility means filling your metaphysics with some substance. I don't really know how to do that for quantum mechanics, but if I wanted to, I might start with QM's inability to explain gravity, and dig into that topic until I could propose a specific way that research on quantum gravity might overturn QM and prove it wrong. Or at least I'd try to make an argument that QM's predictions about QC might turn out to be false.
But I cannot make such arguments, because I don't know QM, and I'm not going to dive into it because I know better ways to spend my time. So I'll keep my mouth shut and not voice vague statements about QM not being reality.
In any case, from a metaphysical standpoint I'm sure that physics is not reality; physics is a mathematical description of reality. It doesn't matter if that description is incomplete or even wrong, as long as it works for our case: Newton's gravity still works, despite having been proven wrong, as long as we bear in mind its limits of applicability. So I see the argument Scott Aaronson discusses as the very general truth "the map is not the territory" applied incorrectly. The correct (intellectually responsible) way is to point to the limits of applicability and to build an argument that they can bite.
> Asking because it's basically what I think[2]
I followed your link after writing the first part, and was delighted to see that you target the limits of applicability. Your argument is different from just saying "mathematical formalism is not reality".
I cannot verify your thoughts, because I don't know QM or how QCs are supposed to work, but you are talking about the limits of applicability: "a theoretically-capable Quantum Computer will be testing the predictions of Quantum Mechanics to a degree of precision hundreds of orders of magnitude greater than any physics experiment to date". That doesn't seem like intellectual irresponsibility to me; it isn't, if you really know what you are talking about and can defend your statement to a PhD in QM.
> but I promise not to argue with any explanation given here.
Feel free to argue, I have nothing against it. :)
BTW, is that why you don't take QC seriously? Do you expect QM (as a theory) to fail as a result of the R&D work being done on QC? I agree there is some probability of QM failing, but my uneducated guess is that the probability is low enough (maybe 0.1?) that QC should still be taken seriously.
didgeoridoo
Isn’t that kind of like saying “complex numbers are a nice formalism, but only the reals actually exist”?
What does “reality” mean, in this context?
AlexCoventry
"Reality" in this case would be a cryptographically relevant Quantum Computer, as predicted by current Quantum Mechanical calculations which use these 2^n-dimensional Hilbert Spaces he mentions.
charlieyu1
Afaik we still haven't factorised 35 using Shor's algorithm; I really don't understand the current hype.
didgeoridoo
Perhaps 35 is prime? We may never know.
Terr_
"Whelllp, it turns out that quantum primes aren't the same as classical primes, who'da thunk it?"
upofadown
There is some thought that we haven't factored anything yet via Shor's. The previous demonstrations assumed the results in their construction.
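For what it's worth, the classical scaffolding of Shor's algorithm is easy to trace for N = 35. The sketch below does the order-finding step by brute force; that loop is the only part a quantum computer is for, and exactly the part that becomes intractable for large N.

    from math import gcd

    # Shor's reduction for N = 35, with order finding done by brute force.
    # Everything except the order-finding loop is classical pre/post-processing.
    N = 35
    a = 2  # a base coprime to N

    r = 1
    while pow(a, r, N) != 1:  # find the order r of a mod N
        r += 1
    print("order of", a, "mod", N, "is", r)  # r = 12

    assert r % 2 == 0  # for even r, gcd(a^(r/2) +/- 1, N) yields factors
    print(N, "=", gcd(pow(a, r // 2) - 1, N), "*", gcd(pow(a, r // 2) + 1, N))
    # 35 = 7 * 5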
__turbobrew__
Yeah, that is a missing part of the argument. It is said that we should judge QC by the advances it has made, not by whether it can break the latest algos.
Well, what is the progress then? How has QC advanced in the past 20 years, and what is the latest that QC can do?
adgjlsfhk1
Qubit counts and error rates have both improved by a few orders of magnitude. Google's paper from August is the first demonstration of error correction. We're probably ~5 years away from doing toy examples of Shor's algorithm with error correction, and ~20 from being able to factor numbers as big as a classical computer can.
Azerty9999
Maybe it's marketing, but there seem to be some advances [1]
[1] https://blog.google/technology/research/google-willow-quantu...
jncfhnb
4 should be enough for anybody.
peepeepoopoo101
There's an awful lot of handwaving in this blog post. I'm sorry, but I'm not convinced. The author mentions that some devices which can seemingly solve exponential-time problems also require exponentially high precision, but there doesn't seem to be a strong argument for why the same doesn't apply to quantum computers. We haven't experimentally demonstrated quantum computing at sufficient scale to prove that the number of physical qubits required for error correction doesn't scale exponentially.
plumthreads
I got the impression that DJB was criticizing the arguments for why quantum computers won't work. Not trying to demonstrate why they will work.
gjvc
yup
wzeng
We don’t need exponentially more physical qubits because we have quantum error correction schemes that exponentially decrease the logical error rate with only a polynomial increase in the number of qubits. There are in fact many schemes for this (https://errorcorrectionzoo.org/) with the surface code mentioned in the blog being a leading approach.
Details for how this could work for factoring are here: https://arxiv.org/abs/1905.09749
There will be engineering challenges to scale up these implementations, but in principle you shouldn't need exponential resources (unless there is something wrong with quantum mechanics). This sort of error-correction scaling does not exist for, say, analog computing.
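As a rough sketch of the scaling being described: the standard rule of thumb for a distance-d surface code is that the logical error rate falls like (p/p_th)^((d+1)/2) while the physical qubit count grows only like ~2d^2. The prefactor 0.1 and threshold p_th = 1% below are illustrative assumptions, not measured values.

    # Surface-code rule of thumb: exponential suppression of logical errors
    # for a polynomial (quadratic in d) number of physical qubits.
    # A = 0.1 and p_th = 0.01 are illustrative assumptions.

    def logical_error_rate(p_phys, d, A=0.1, p_th=0.01):
        return A * (p_phys / p_th) ** ((d + 1) // 2)

    def physical_qubits(d):
        return 2 * d * d  # rough count for one distance-d logical qubit

    p = 0.001  # a physical error rate 10x below threshold
    for d in (3, 7, 11, 15, 21):
        print("d =", d, "->", physical_qubits(d), "physical qubits,",
              "logical error rate ~ %.0e" % logical_error_rate(p, d))

In this model each increase of d by 2 buys another factor of p_th/p in suppression; conversely, for p above p_th, increasing d makes things worse, which is the "everything disintegrates for physical error rates around 1%" point quoted below.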
rq1
The author is DJB.
sebmellen
For those not familiar: https://en.m.wikipedia.org/wiki/Daniel_J._Bernstein
gjvc
I think you've missed the point of it then.
Yoric
Note that you do not need error correction for quantum computing. You only need it for digital quantum computing. There's a separate branch, analog quantum computing, that is also very promising.
upofadown
>Everything disintegrates for physical error rates around 1% or above
Last I heard we were 1-2 orders of magnitude away from the error-correction break-even point for noise performance, i.e. the point at which it would take an infinite number of noisy qubits to break 2048-bit RSA. So does this mean we are still at an error rate of something like 10%?
wzeng
Several approaches are already past the break-even point today, including Google's demonstration that error correction reduces logical errors: https://www.nature.com/articles/s41586-024-08449-y
There are more citations on gate-fidelity progress here: https://metriq.info/Task/38
upofadown
Did the Google experiment actually hit break-even? My understanding was that it had only demonstrated that surface codes do what they were predicted to do. Is it really only a matter of building more hardware at this point?
Strilanc
Yes, it exceeded break-even, but no, you can't just copy-paste hardware yet. For example, some kind of chip-to-chip coupling is needed, since chips can't be arbitrarily large.
johnea
Great to see DJB's work posted here!
aaron695
What about the Bitcoin argument? If a CRQC is built, given that you only need ~1000 logical qubits (and probably fewer gates than for RSA) to break ECDSA, you would expect a big financial incentive to use the compute just a little to crack a few cold wallets. (After all, leaking classified information to the press is much more likely to get you caught than cracking wallets is.) And we would notice the breakthrough almost immediately, or at worst within a few weeks.