The Inevitable Shift to Qubit Efficiency & Error Correction Is Finally Happening.
This year we’ve seen a number of new and updated roadmaps released by some high-profile quantum hardware vendors. The targets are aggressive.
Recently, many providers have unveiled trajectories involving millions of physical qubits. Logical qubit counts, which tame the challenges of error correction by encoding information redundantly across many physical qubits, range as high as tens of thousands. Logical error rates, the standard benchmark of a QPU's overall computational performance, are slated to fall dramatically, waving the green flag on commercial-grade applications. Companies that have not published new roadmaps are vocally doubling down on their existing plans, sticking to the core tenet that a million or more qubits are required to usher in quantum utility.
The history of quantum roadmaps clearly shows a qubit arms race, one that in many cases obscures a deeper, more pragmatic focus on the core technology approaches that will maximize success in the long run.
One of the most important is error correction, along with the key facets of a given modality's architecture that support it efficiently. Likewise, the near-term capabilities of quantum systems are essential to support a robust ecosystem of application exploration. They foster novel algorithm discovery within the quantum community and industry at large, setting up the field to hit the ground running when qubit counts and error rates do reach the levels required for quantum utility.
This arms race therefore merits a pause. We need to say what the market is unable to see and what vendors do not want to hear.
For years, high-volume qubit vendors have been running the wrong race.
They’re running an inefficient race, expending excessive “brute force” energy to produce massive qubit counts and compounding the issue by attempting to scale before solving error correction.
In recent years, large slices of the industry have focused on boiling a QPU down to a few technical statistics, oversimplifying the message for a mass market that doesn’t know what it doesn’t know. Yes, the qubit number matters, as does the error rate. And of course, the logical qubit count is important too. But there are many more focal areas, metrics, and questions that enable a well-rounded judgment of effective quantum methodologies. Without them, headline qubit volumes create misleading perceptions.
If it takes a football field and a dedicated power plant to build just a single QPU, is that the best approach? Is it the most efficient method? Is it cost-efficient? What applications matter? How will a QPU be democratized, fit into an HPC ecosystem, and reach developers worldwide?
Quantum is missing an opportunity to tell a bigger story. A more meaningful story. The whole story. Investors, the media, enterprises, governments – they all need to hear it. The majority of the quantum industry is missing another race that avoids the public inertia of qubit volumes. Quantum error correction is the critical race, and it’s essential to get it right.
Run the Right Race
Error correction is one of the capabilities a quantum computer requires to successfully execute algorithms at scale. Its purpose is to correct the errors that corrupt qubits. Because of the special quantum properties of qubits, the very properties that make quantum computing so powerful, the resource requirements for error correction have typically been high in conventional techniques, making it difficult to achieve in practice.
At Quantum Circuits, we have found a way to dramatically simplify things, leveraging a new, efficient approach.
The foundation of this narrative is our Dual-Rail Cavity Qubit (DRQ) technology. The approach is new, rapidly advancing, and demonstrating some of the best performance metrics across all hardware modalities in the industry. A multi-component, all-superconducting unit, the DRQ harnesses the powerful properties of microwave-frequency resonators and transmons. These elements, arranged in a clever layout, enable high-fidelity quantum gates and measurements at high clock speeds, a combination unique in the industry.
DRQs also introduce a new capability that undermines the brute-force qubit-volume arms race: error detection built in at the hardware level. Unlike other qubits, which are measured as just 0 or 1, DRQs return three possible results: 0, 1, and *. It’s this third result, *, that indicates a DRQ experienced an error during an algorithm. What’s unique is that users can learn, for the first time on a qubit-by-qubit basis, where the majority of errors occurred in the algorithm.
Conventional approaches across ions, atoms, and other superconducting qubits experience “silent errors”: algorithm results simply get worse, and only bulky error correction codes can improve them. With built-in error detection, DRQs provide extra insight into error dynamics, bridging the gap from near-term systems to scalable error correction much more efficiently.
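To make the contrast concrete, here is a minimal sketch in plain Python of how a flagged third outcome changes post-processing. The circuit, error rate, and shot format are made up for illustration and do not represent Quantum Circuits' actual API: the point is that shots containing a detected error can be discarded, and the flags localize which qubit erred.

```python
# Illustrative sketch (hypothetical numbers, no real hardware API):
# each shot of a 4-qubit circuit yields '0' or '1' per qubit, or '*'
# when the hardware flags a detected error on that qubit.
import random

random.seed(7)

def run_shot(n_qubits=4, p_err=0.05):
    """Simulate one shot: each qubit yields '0'/'1', or '*' if an error was detected."""
    return ["*" if random.random() < p_err else random.choice("01")
            for _ in range(n_qubits)]

shots = [run_shot() for _ in range(10_000)]

# With silent errors you would have to keep every shot; with detection
# you can discard (post-select away) any shot containing a flagged qubit.
clean = [s for s in shots if "*" not in s]

# The flags also localize errors: count how often each qubit position erred.
err_counts = [sum(s[q] == "*" for s in shots) for q in range(4)]

print(f"kept {len(clean)}/{len(shots)} shots after post-selection")
print(f"detected-error counts per qubit: {err_counts}")
```

Post-selection like this trades shot count for fidelity in near-term programs; the same flag information is what feeds more efficient error correction at scale.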
Error detection with DRQs is therefore more than just a mechanism to enhance fidelities for near-term quantum programs. It’s about winning the race to error correction more efficiently. The DRQ approach both reduces the number of qubits required to achieve low error rates and makes it easier to get error correction working to begin with. Conventional approaches do not have these advantages, and that’s what will hold them back.
DRQs keep things simple, saving capex and computational resources while maintaining the low error rates necessary for commercial-grade applications and shrinking the overall QPU footprint. Why build a million qubits to do something revolutionary if you can do it with 10X fewer, or better? No football field required. Less capex drain. Lower resource demands. A longer cash runway. Better investment returns. A greater chance at achieving sustainable fault tolerance.
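For a sense of why detection translates into fewer physical qubits, here is a back-of-envelope sketch using a surface-code-style scaling law. Every number below is an illustrative assumption, not a Quantum Circuits figure: the general point, that detected (flagged) errors admit a higher correction threshold than silent ones, means the same logical error target is reached at a smaller code distance, and the physical-qubit cost grows with the square of that distance.

```python
# Back-of-envelope sketch with assumed, illustrative numbers.
# Surface-code-style scaling: logical error rate p_L ~ (p/p_th)**((d+1)/2),
# with roughly 2*d**2 physical qubits per logical qubit at distance d.

def qubits_needed(p, p_th, target_pL):
    """Smallest odd distance d meeting target_pL, and its approximate qubit cost."""
    d = 3
    while (p / p_th) ** ((d + 1) / 2) > target_pL:
        d += 2
    return d, 2 * d * d

p = 1e-3        # assumed physical error rate
target = 1e-10  # assumed logical error target for commercial-grade runs

# Assumed thresholds: silent errors vs. hardware-flagged (erasure-like) errors.
d_silent, n_silent = qubits_needed(p, p_th=7e-3, target_pL=target)
d_flagged, n_flagged = qubits_needed(p, p_th=4e-2, target_pL=target)

print(f"silent errors:  distance {d_silent}, ~{n_silent} physical qubits per logical")
print(f"flagged errors: distance {d_flagged}, ~{n_flagged} physical qubits per logical")
```

Even with these toy inputs, the flagged-error case needs roughly a third of the physical qubits per logical qubit, and the savings multiply across thousands of logical qubits.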
At the same time, Quantum Circuits is releasing new features along the way to offer our customers and partners a unique opportunity to explore uncharted spaces in quantum application development. Error detection is a fundamental requirement. DRQs give users access to previously inaccessible sources of high-quality data that no other system provides – error records at the single qubit level. These error records can be leveraged to discover novel applications across a host of high-value spaces.
More data means more classical horsepower is needed to help with the processing, both real-time and offline. That’s where the other two legs of the stool come in – GPUs and CPUs. The QPU is a remarkable force that will further accelerate today’s HPC techniques and maximize AI and quantum benefits. DRQs will be a key resource. More on that in a later blog post.
To be objective, we’re not the only ones with a hardware-efficient approach. Other hardware modalities, such as cat qubits, are showing progress. Their proponents have offered roadmaps targeting low overheads between physical and logical qubit counts and have shown impressive results on error correction. Some providers are targeting promising theoretical error correction codes that require overcoming high qubit-connectivity hurdles at scale. While we root for the success of the quantum industry and look forward to seeing how these methodologies evolve, we are championing the success of our dual-rail architecture and expect it to win out.
At Quantum Circuits, a key priority is deploying bigger, more capable systems that deliver larger logical qubit counts at lower logical error rates, with error correction as the bedrock of that strategy rather than a seat in the arms race we are witnessing. As we build on our Aqumen Seeker product releases of 2024 and the exciting results demonstrated in 2025, we look ahead to revealing a roadmap that articulates this narrative, focusing on system practicality, efficiency, and benefits for users.
We are building bigger and more capable systems while simplifying the challenge of scaling with an innovative qubit architecture. Our pragmatic “correct first, then scale” approach sets the stage for a faster, more efficient path to fault tolerance that enterprises will bet their businesses on.
As our CEO Ray Smets often says, there will be more than one winner in quantum. There will be multiple winners. And they will all be geared toward error detection, error correction, and efficiency. The brute-force high-volume qubit arms race is the wrong race to run. It’s time to realize that qubit efficiency is the rabbit to chase.