Fujitsu and RIKEN develop world-leading 256-qubit superconducting quantum computer
33 comments
· April 22, 2025
rtrgrd
To people who do quantum computing: are qubits (after error correction) functionally equivalent, and hence directly comparable across quantum computers? Is that a useful objective measure for comparing progress, or is it more of an easy-to-mediatise stat?
staunton
Pretty much everything you read (especially when aimed at a non-expert audience) about quantum computing in the media is "easy-to-mediatize" information only.
People building these things are trying to oversell their achievements while carefully avoiding making them easy to check, reproduce, or objectively compare to others. It's hard to objectively evaluate even for people who work in the field but haven't worked on the exact technology platform reported on. Metrics are tailored to marketing goals; for example, IBM made up a performance metric called "quantum volume", only to basically stop using it when it seemed to no longer favour them.
That being said, it's also undeniable that quantum computing is making significant progress, error correction being a major milestone. What this ends up being actually used for, if anything, remains to be seen (I'm rather sure we'll find something).
packetlost
I worked on a quantum computer for several years and can speak to this a bit: sorta. They're functionally equivalent in the sense that you can usually do the same computations, but there are a ton of details that determine how each particular modality behaves. Things like gate fidelities (how good the gates are), how fast the gates can "execute", how long it takes to initialize the quantum state so you can execute gates, how long the decoherence times are (how long before the quantum state is lost), and many (many) other differences. Some modalities even have restrictions on which qubits can interact with which other qubits, which will, among other things, impact algorithm design.
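A rough way to picture those figures of merit (the numbers below are made up purely for illustration, not real device specs):

    # Purely illustrative numbers, not real device specs -- just to show the
    # kinds of knobs that differ between hardware modalities.
    from dataclasses import dataclass

    @dataclass
    class Modality:
        name: str
        two_qubit_fidelity: float  # fraction of 2-qubit gates that succeed
        gate_time_us: float        # how long one 2-qubit gate takes (microseconds)
        coherence_time_us: float   # roughly how long the quantum state survives
        connectivity: str          # which qubits can interact with which

        def gates_before_decoherence(self) -> int:
            # crude depth budget: how many gates fit inside one coherence time
            return int(self.coherence_time_us / self.gate_time_us)

    platforms = [
        Modality("superconducting (made-up)", 0.995, 0.05, 100.0, "nearest-neighbour grid"),
        Modality("trapped ion (made-up)", 0.999, 100.0, 1_000_000.0, "all-to-all within a trap"),
    ]
    for m in platforms:
        print(m.name, m.gates_before_decoherence())  # 2000 vs 10000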
ziofill
Also the error correction strategies that are available to different architectures/platforms can make a huge difference in the future.
packetlost
Yup! Though error correction was not something I spent a lot of time on. I worked primarily on "quantum OS" (really, just AMO control systems) so wasn't thinking much on the theoretical side of things.
joshuaissac
No, they are not comparable. There are gate-model quantum computers like the one described in the article, and there are quantum annealers from companies like D-Wave that are geared toward solving a type of problem called QUBO: https://en.wikipedia.org/wiki/Quadratic_unconstrained_binary...
The latter has released quantum computers with thousands of qubits, but these qubits are not comparable with the physical qubits in a gate-model computer (and especially not with logical qubits from one).
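For anyone curious what that problem class looks like, here is a toy 3-variable QUBO solved by brute force (the coefficients are made up; real instances have thousands of variables and get embedded onto the annealer's hardware graph):

    # Toy QUBO: minimize x^T Q x over binary vectors x. Made-up coefficients;
    # the point is only to show the problem class annealers target.
    import itertools

    Q = {  # upper-triangular coefficients of a 3-variable QUBO
        (0, 0): -1.0, (1, 1): -1.0, (2, 2): -2.0,
        (0, 1): 2.0, (1, 2): 2.0,
    }

    def energy(x):
        return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

    best = min(itertools.product((0, 1), repeat=3), key=energy)
    print(best, energy(best))  # (1, 0, 1) -3.0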
k__
As far as I understand quantum error correction, the number of physical qubits might vary between systems with the same number of logical qubits. So, if you care about the overhead, they might not be equivalent in practice.
gaze
The idea is to implement something like the quantum equivalent of a Turing machine. What one universal quantum computer can do, another can. So yeah. However, connectivity and gate/measurement time will set some aspects of the performance but not the asymptotics.
JBits
It matters, but it's not functionally equivalent between different architectures.
Since no one has many qubits, physical qubits are typically compared, as opposed to virtual qubits (the error-corrected ones).
The other key figures of merit are the 1-qubit and 2-qubit gate fidelities (basically the success rates). The 2-qubit gate is typically more difficult and has a lower fidelity, so people often compare qubits by looking only at the 2-qubit gate fidelity. Every 9 added to the 2-qubit gate fidelity is expected to roughly decrease the ratio of physical to virtual qubits by an order of magnitude.
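A back-of-the-envelope sketch of that heuristic (the ~1,000x overhead at 99.9% fidelity is an assumed baseline for illustration; the real overhead depends on the error-correcting code and the target logical error rate):

    import math

    # Rough sketch of the "one order of magnitude per extra nine" heuristic.
    # The 1,000x overhead at 99.9% fidelity is an assumed baseline, not a
    # property of any particular error-correcting code.
    def rough_overhead(two_qubit_fidelity, baseline_fidelity=0.999, baseline_overhead=1_000):
        nines = -math.log10(1 - two_qubit_fidelity)
        baseline_nines = -math.log10(1 - baseline_fidelity)
        return baseline_overhead / 10 ** (nines - baseline_nines)

    for f in (0.99, 0.999, 0.9999):
        print(f, round(rough_overhead(f)))  # ~10000, ~1000, ~100 physical per virtual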
In architectures where qubits are fixed in place and can only talk to their nearest neighbours, moving information around requires swap gates, which are made up of the elementary 1- and 2-qubit gates. Some architectures have mobile qubits and all-to-all connectivity, so their proponents hope to avoid swap gates, considerably reducing the number of 2-qubit gates required to run an algorithm and thus resulting in fewer errors to deal with.
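A toy count of that routing overhead on a 1D nearest-neighbour layout (assuming the standard SWAP = 3 CNOTs decomposition and naive one-interaction-at-a-time routing; real compilers do much better):

    # Toy routing cost on a line of qubits where only neighbours can interact.
    def two_qubit_gate_cost(q1, q2):
        distance = abs(q1 - q2)
        swaps = max(distance - 1, 0)  # SWAPs needed to make the qubits adjacent
        return 3 * swaps + 1          # 3 CNOTs per SWAP, plus the gate itself

    print(two_qubit_gate_cost(0, 1))   # adjacent qubits: 1 elementary 2-qubit gate
    print(two_qubit_gate_cost(0, 10))  # distant qubits: 28 elementary 2-qubit gates
    # With all-to-all connectivity the answer is always 1.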
Some companies, particularly ones on younger architectures but perhaps with much better gate fidelities, argue that their scheme is better by virtue of being more "scalable" (having more potential in the future).
It is expected that in the future the overall clock speed of the quantum computer will matter, as the circuits we ultimately want to run are expected to be massively long. Since we're far from the point where this matters, clock speed is rarely brought up.
In general, different architectures have different advantages. With different proponents having different beliefs of what matters, it was once described to me as each architecture having their own religion.
TL;DR: the two key stats are number of qubits and 2-qubit gate fidelity.
lukan
So .. what can you actually do with that thing?
Are there any real world applications yet? Or is the real world application, quantum state experiments?
I think we are pretty far from using it as a general-purpose computer, or even for special (disruptive) use cases like factorization. So who could use it with benefit?
staunton
There won't be "real world applications" for many years to come.
If I had to bet on what (impactful) application might come first, I'd guess simulation of chemical/physical properties used for drug development and materials science.
lukan
"Both organizations will integrate the 256-qubit superconducting quantum computer into its platform for hybrid quantum computing lineup and offer it to companies and research institutions"
But they offer it for rent. Who would be a buyer for the quantum part of the hybrid?
"Research institutions" but for what kind of research?
Or is this rather wishful thinking/PR "we bring quantum computing to the market (just nobody uses it)"?
staunton
> "Research institutions" but for what kind of research?
Quantum computing research. I'd guess a big chunk of revenue will come from universities and research institutes. Some companies might also pay for it, e.g. quantum computing startups in need of anything they can show before they have hardware, or startups that aren't even planning to build their own hardware.
There are people working on finding useful problems that these devices might help with and how to best make use of them, how to build "infrastructure" for it. It's useful for them to have something to play with. Also, many organizations want to be (seen as) at the forefront of quantum computing, know the current capabilities, strengths and weaknesses of the various platforms, train and educate people about quantum computing and quantum technology in general, etc.
mapt
Is there some minimum number of qubits at which some minimum viable quantum-supreme task can theoretically be achieved?
What would be required to factor a 1024 bit integer key?
tromp
You might as well ask what would be required to factor an 8 bit integer key. Because decades after the factorization of 21, we're still waiting for a quantum computer able to factor any product of two 4-bit primes with the general Shor algorithm.
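For context, here is the classical post-processing side of Shor for N = 21; the whole point of the quantum hardware is the order-finding step, which is brute-forced below and is exactly the part that doesn't scale classically:

    from math import gcd

    # Classical post-processing of Shor's algorithm for N = 21, a = 2.
    # The order-finding step below is brute-forced; on a quantum computer it
    # is the part done with the quantum Fourier transform.
    N, a = 21, 2
    r = next(k for k in range(1, N) if pow(a, k, N) == 1)  # r = 6
    assert r % 2 == 0 and pow(a, r // 2, N) != N - 1       # the "lucky" case holds
    print(gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N))  # 7 3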
ryao
> Is there some minimum number of qubits at which some minimum viable quantum-supreme task can theoretically be achieved?
That is a very broad range of possibilities, so allow me to narrow it to cryptography. I am by no means an expert on this, but I spent the weekend reading about the quantum-computing motivations for changing the cryptographic algorithms society uses, and as far as I can tell, nobody knows what the hard lower bound is for breaking hardness assumptions in classical cryptography. The best guess is that it is many orders of magnitude higher than what current machines can do.
We are so far from machines capable of this that it is unclear that a machine capable of it will be made in our lifetimes, despite optimism/fear/hope, depending on who you are, to the contrary.
> What would be required to factor a 1024 bit integer key?
I assume you mean a 1024-bit RSA key (if you mean a 1024-bit ECC key, I want to know what curve). You can crack 1024-bit RSA with 2051 logical qubits according to:
https://arxiv.org/abs/quant-ph/0205095
In order for it to actually work, it is believed that you will need between 1,000 and 10,000 physical qubits for every logical qubit, so it could take up to about 20 million physical qubits.
Coincidentally, the following paper claims that cracking a 2048-bit RSA key can be done in 8 hours with 20 million physical qubits:
https://arxiv.org/abs/1905.09749
That sounds like it should not make sense given the previous upper estimate of 20 million physical qubits for a 1024-bit RSA key. As far as I can tell, there are different ways of implementing Shor’s algorithm, and some use more qubits while others use fewer. The biggest factor in the number of physical qubits used is the error correction. If you can do better error correction, you can use fewer physical qubits.
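Rough arithmetic behind those numbers (the 1,000-10,000x overhead range is the assumption from above, so the naive product is only a crude upper bound):

    # Naive physical-qubit estimate for RSA-1024 via the 2n+3 logical-qubit
    # circuit (arXiv:quant-ph/0205095), times an assumed 1,000-10,000x
    # error-correction overhead.
    logical = 2 * 1024 + 3                    # 2051 logical qubits
    print(logical * 1_000, logical * 10_000)  # ~2 million to ~20 million physical

    # Gidney & Ekerå (arXiv:1905.09749) get ~20 million *physical* qubits for
    # RSA-2048 in ~8 hours by using a far more qubit-efficient construction,
    # which is why the naive product above is only a very rough bound.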
upofadown
Factoring requires much better noise performance. So you don't have to consider the number of qubits yet for that particular application. A fundamental breakthrough is required.
There might be applications other than factoring that can be addressed with the noisy qubits we can actually create.
ryao
That was my thought. Somehow Ford managed to use one to improve its manufacturing process by having it solve a scheduling problem in 5 minutes that previously took an hour:
https://www.dwavequantum.com/company/newsroom/press-release/...
The news is light on technical details. Beyond that, I have no clue about useful applications.
griffzhowl
If anyone's looking for a clear and concise intro book, the best I've found is N. David Mermin's Quantum Computer Science. It's geared towards CS students, so it's very approachable and requires only familiarity with the basics of finite-dimensional complex vector spaces.
liendolucas
Those pictures look like something just came down from the roof to be repaired.