Building a scalable quantum computer starts with making the right decisions about the hardware architecture that will underpin the entire processor. The most consequential is the choice of physical qubit. A physical qubit is the fundamental building block, the quantum analog of a classical transistor. It can be a single atom, ion, photon, or electron, or even a tiny amount of electrical current sloshing back and forth through an ultra-cold circuit.
Physical qubits have high error rates, far too high for the disruptive, high-impact applications where we envisage quantum utility. While classical computers have error rates of roughly 1 in 1 billion operations, the best physical qubits err about 1 in 1 thousand times. That's not good enough for useful computation.
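To see why roughly 1-in-1,000 falls short, consider a rough back-of-the-envelope estimate (illustrative numbers, not a benchmark of any specific system): if each operation fails independently with probability p, a circuit of N operations runs error-free with probability about (1 - p)^N.

```python
# Rough, illustrative estimate: probability that an N-operation circuit
# runs error-free when each operation fails independently with probability p.
def success_probability(p: float, n_ops: int) -> float:
    return (1.0 - p) ** n_ops

# Best physical qubits today: p ~ 1e-3.
# A modest 10,000-operation algorithm almost certainly fails:
print(success_probability(1e-3, 10_000))   # ~ 4.5e-5

# A classical processor at p ~ 1e-9 barely notices the same workload:
print(success_probability(1e-9, 10_000))   # ~ 0.99999
```

Even this simple model shows the gap: at physical error rates, almost every shot of a useful-sized circuit contains at least one error.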
Error correction has typically been the next step: scale by assembling collections of physical qubits, called logical qubits, in just the right way to suppress errors and produce more reliable results from applications overall. But this is a big leap, requiring many qubits and substantial resources. Diverse approaches have been devised and tested, and continue to be pursued with varying degrees of success; most contend with the large hardware overhead needed to realize error correction at the levels required for quantum utility.
At Quantum Circuits, we’ve designed a qubit that opens a new avenue of exploration by bridging the gap between the conventional physical qubit and the logical qubit. It’s called the Dual-Rail Qubit, or DRQ. In our systems it is still the “physical” qubit, the ground floor of our scaling approach, yet it is more capable than the qubits in traditional quantum processors, including other superconducting designs. DRQs are naturally high-reliability qubits, and because they are superconducting, they are also fast. This combination is new to the industry, and it promotes fresh algorithm development by letting users run with higher fidelity and higher throughput at the same time.
The DRQ is built from several components. The actual quantum bit of information is stored in a photon that’s shared between two superconducting cavities. Think of a droplet of gasoline sloshing between two pistons in a car engine, except it’s a quantum droplet of light that doesn’t combust. The other components in the DRQ are quantum circuit elements, akin to the rest of the engine, all working together to keep things running smoothly.
The bridge between the conventional physical qubit and error correction is that the DRQ can detect the primary error in the system: photon loss. It has a built-in “check engine” light that flashes when the droplet of quantum gas disappears and the engine misfires. In this sense, the DRQ acts like a logical quantum computing unit: it performs the error detection that is the cornerstone of our longer-term error correction and scaling approach, and it gives users a rich set of features for exploring new algorithms on upcoming systems.
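A toy model makes the “check engine” light concrete. In a dual-rail encoding, the logical states correspond to which of the two cavities holds the single photon; if the photon leaks away, the system lands outside the code space, where a simple check flags it. (This sketch is illustrative; the function names and the occupation-tuple representation are our own, not Quantum Circuits’ API.)

```python
# Toy model of a dual-rail qubit: one photon shared between two cavities.
# We track only the photon occupation of (cavity_A, cavity_B).
#   logical |0> = photon in cavity A -> occupation (1, 0)
#   logical |1> = photon in cavity B -> occupation (0, 1)
# Photon loss sends either state to (0, 0), which lies OUTSIDE the code
# space -- that's the built-in "check engine" light.

LOGICAL_ZERO = (1, 0)
LOGICAL_ONE = (0, 1)

def lose_photon(state):
    """Model the dominant error: the single photon leaks away."""
    return (0, 0)

def photon_loss_detected(state):
    """Detection check: a valid dual-rail state holds exactly one photon."""
    return sum(state) != 1

# A healthy qubit passes the check...
print(photon_loss_detected(LOGICAL_ZERO))              # False
# ...but after photon loss the error is flagged rather than silently
# corrupting the computation:
print(photon_loss_detected(lose_photon(LOGICAL_ONE)))  # True
```

The key point the sketch captures: photon loss is converted from a silent data error into a flagged event, which is exactly the property the rest of the architecture builds on.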
The next natural question is why error detection is necessary or even interesting. Why not simply follow the conventional path toward error correction that’s well-trodden or continue devising near-term algorithms with existing devices and features?
In principle, all conventional strategies are still fair game. The caveat is that the engineering resources needed to reach working systems should not be underestimated. Scaling paths based on typical error correction approaches often require stellar qubit performance just to cross the threshold where error correction becomes possible, and even then, the resources required to assemble logical qubits can be daunting.
Near-term, the prospects of finding useful applications at current physical error rates are slim. Classical simulation is still powerful enough to reproduce today’s quantum computing results, so quantum utility is not in the cards. Furthermore, the best quantum processors are typically quite slow, so the most promising applications take a very long time to execute, adding yet another hurdle for users.
A new angle is needed.
This is where DRQs offer a real boost. Starting with error correction, built-in error detection is a boon: detected errors (erasures) are far easier to correct than unknown errors, which considerably reduces what it takes to get error correction working. The scaling is also very favorable in terms of how much quantum hardware is required to suppress the error rate, making this architecture an excellent candidate for fault-tolerant quantum computing.
Next, DRQs offer the industry a new set of features based on error detection. This is something new to try, a new way to explore algorithms in the near term without waiting for full error correction. It’s an exciting prospect.
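One way error detection pays off even before error correction: shots where the flag fires can simply be discarded, so the shots you keep are far more reliable. A quick calculation with hypothetical numbers (the rates below are illustrative, not measured DRQ figures) shows the effect of this postselection.

```python
# Illustrative postselection arithmetic (hypothetical rates):
# if a fraction of errors is detected and those shots are discarded,
# what error rate remains among the accepted shots?
def accepted_error_rate(p_err: float, frac_detected: float) -> float:
    undetected = p_err * (1.0 - frac_detected)   # errors that slip through
    accepted = (1.0 - p_err) + undetected        # shots that pass the check
    return undetected / accepted

# Without detection, 1% of shots are silently wrong:
print(accepted_error_rate(0.01, 0.0))   # 0.01
# If 90% of errors are flagged and discarded, the kept shots err
# only about 0.1% of the time:
print(accepted_error_rate(0.01, 0.9))   # ~ 0.001
```

The trade is a modest loss in throughput (flagged shots are thrown away) for a substantial gain in the fidelity of the results that remain.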
At Quantum Circuits, we couple this with another powerful feature called advanced real-time control. Together with error detection, these two features fall under a new holistic tool we provide called error awareness, which you can read about in our blog post here.
DRQs are like a logical engine, unlike any qubit out there. More than a typical physical qubit, they give rise to a new set of opportunities to explore new algorithms, and we’re excited to share that with the world. We’ve launched our software and simulators on our own full-stack cloud platform, and our quantum systems are coming online soon. If you’d like to be one of the first to take it all for a spin, please reach out to us at partner@quantumcircuits.com.