Exclusive—IBM Shares Details of Its 400+ Qubit Quantum Processor
All About Circuits spoke with IBM physicist and chief quantum hardware architect Oliver Dial to get exclusive details on IBM's 400+ qubit quantum processor: Osprey.
At the IBM Quantum Summit in Nov. 2022, IBM announced Osprey, a 400+ qubit quantum processor. IBM aims to achieve quantum systems with 4,000+ qubits by 2025, unlocking supercomputing capabilities and tackling increasingly complex computational problems.
We spoke with Oliver Dial, physicist and chief quantum hardware architect at IBM, involved in developing the new 400+ qubit quantum processor.
The 433-qubit IBM Osprey chip. Image courtesy of Ryan Lavine/IBM
Dial has significant experience developing high-frequency electronics, cryogenic systems, and semiconductor spin qubits. At IBM, he specializes in superconducting qubits, researching their underlying physics and collecting system-level metrics.
A Quantum Processor with 400+ Qubits
IBM’s new quantum processor contains 433 qubits known as transmons, which are essentially superconducting resonators that can store zero or one microwave photon. These qubits are manipulated from outside the processor by applying microwave pulses at specific frequencies.
“Our qubits are connected to each other with busses. Different qubits directly connected by busses have different frequencies, so we can control them independently,” Dial explained. “While transmons are a common qubit type, we use fixed-frequency transmons—meaning the frequency of microwaves we use to control them is determined when we make the device. We can’t tweak it during testing. This gives our devices great coherence times but puts a lot of emphasis on fabricating things accurately, so we can meet that frequency requirement.”
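The frequency-selective addressing Dial describes can be illustrated with the standard two-level Rabi result: a drive detuned from a qubit's transition barely excites it. The sketch below uses the textbook formula and made-up numbers, not IBM's actual device parameters.

```python
def excited_population_max(rabi_rate_mhz, detuning_mhz):
    """Peak excited-state population of a driven two-level system
    (standard Rabi formula): P_max = omega^2 / (omega^2 + delta^2).
    Illustrative only; real calibration is far more involved."""
    o2 = rabi_rate_mhz ** 2
    return o2 / (o2 + detuning_mhz ** 2)

# Hypothetical numbers: a 5 MHz drive fully excites the resonant qubit...
on_resonance = excited_population_max(5.0, 0.0)    # 1.0
# ...but barely disturbs a neighbor detuned by 100 MHz.
neighbor = excited_population_max(5.0, 100.0)      # ~0.0025

print(on_resonance, neighbor)
```

Because each qubit sits at its own fixed frequency, a pulse aimed at one qubit leaves its differently tuned neighbors nearly untouched, which is what makes independent control possible.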
The researchers' device is supported by passive microwave circuitry, which does not deliberately absorb or emit microwave signals but redirects them. Examples of on-chip passive circuitry include microwave resonators that measure the state of the qubits, filters that protect the qubits from decaying out of a drive line, and transmission lines (in other words, wires) that deliver microwave signals to the qubits and to and from the readouts.
Dario Gil (IBM senior VP and director of research), Jay Gambetta (IBM fellow and VP of quantum computing), and Jerry Chow (IBM fellow and director of quantum infrastructure) presenting the 433-qubit IBM Osprey chip. Image courtesy of Ryan Lavine/IBM
“We build all this circuitry on chip with the qubits, using much of the same techniques as what’s called back-end-of-line wiring in traditional CMOS processes,” Dial said. “However, all these techniques must be modified to use superconducting metals.”
These multi-layered devices place qubits on a single chip, which is connected to a second chip known as the interposer through superconducting bonds. The interposer has readout resonators on its surface and multi-level wiring buried inside it, which delivers signals into and out of the devices.
IBM delivers its 433-qubit Osprey quantum processor. It has the largest qubit count of any IBM quantum processor, more than tripling the 127 qubits on the IBM Eagle processor unveiled in 2021. Image courtesy of Connie Zhou/IBM
This unique design creates a clear separation between qubits, readout resonators, and other circuitry, reducing microwave loss, which the qubits are very sensitive to. Ultimately, this is what allowed the researchers to pack so many qubits onto a single chip while maintaining good coherence.
“We developed this general structure in Eagle, a 127-qubit processor that we built last year,” Dial said. “Eagle was the first integration of all these technologies, while Osprey proves that we can use them to make processors larger than anything we’ve made before. A lot of what’s new on Osprey isn’t what’s on the chip itself—which is a refinement of Eagle—but what surrounds it.”
A More Sophisticated Design
IBM's new quantum processor operates at a very low temperature of approximately 0.02 kelvin. The team thus had to find a way to deliver hundreds of microwave signals into this low-temperature environment given the limited cooling power of the processor’s refrigerator (about 100 µW).
“The cables that deliver microwave signals to our processor are a particular problem, as most things that conduct electricity well also conduct heat and thus compromise the insulation of our refrigerator,” Dial explained. “To tackle this problem, our Eagle processor used over 600 cables going between different stages of the fridge, each assembled, wired, and tested by hand. In Osprey, we replace most of these cables with flexible ribbon cables created using standard printed circuit board techniques. Each one of these cables replaces many individual cables, connectors, and components—simplifying our design and thus increasing the processor’s reliability.”
The Osprey processor is supported by a new generation of control electronics, instruments outside of the refrigerator that create an interface between classical and quantum computing tools. These tools, which build on IBM’s previous work, generate microwave control signals for the new chip and interpret signals that come back.
IBM’s new processor has the potential to run complex quantum circuits beyond what any classical computer would ever be capable of. For reference, the number of classical bits that would be necessary to represent a state on the IBM Osprey processor exceeds the total number of atoms in the known universe. Image courtesy of Connie Zhou/IBM
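One common way to read the caption's claim: describing an arbitrary 433-qubit state classically requires 2^433 complex amplitudes, while the number of atoms in the observable universe is commonly estimated at around 10^80. A quick back-of-envelope check:

```python
# Amplitudes needed to describe an arbitrary 433-qubit quantum state.
amplitudes = 2 ** 433

# Common rough estimate of atoms in the observable universe.
atoms_estimate = 10 ** 80

print(amplitudes > atoms_estimate)   # True
print(len(str(amplitudes)))          # 2**433 is a ~131-digit number
```

So the state space outgrows any conceivable classical storage long before 433 qubits, which is the sense in which such circuits sit beyond brute-force classical simulation.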
“We achieved a new and simpler design for generating the analog signals, based on direct digital synthesis, and water-cooled the electronics to increase their density—letting us reach a whopping 400 qubits of control per rack,” Dial said.
The Osprey processor is based on a platform that was refined over several years, with technologies that IBM already tested and implemented on its Falcon, Hummingbird, and Eagle processors. The primary advancements from these previous processors are the wiring and control systems outside of the chip, as well as the scaled-up software stack.
“We’re also incorporating some learning into how we tune the device (i.e., its gate times, powers, etc.), which we think will make large sections of the device have much better average fidelities than what we’ve typically managed in the past,” Dial said. “We think this will make it an ideal platform for studying error mitigation—running multiple copies of a circuit with slight variations to generate more accurate expectation values.”
Approaching the Quantum-centric Supercomputing Era
The new processor created by Dial and his colleagues is another step toward the era of quantum-centric supercomputing (i.e., when quantum computers can solve arbitrarily scaled problems).
“When we build a classical supercomputer, we don’t build a single fast processor, but we harness many processors working in parallel, which creates flexibility to solve one large problem, or many small problems at the same time,” Dial explained. “Similarly, we want to work toward a quantum architecture that can scale up and down, solving the parts of our users’ problems that are best solved on a quantum computer with a quantum computer, and solving the parts of their problems that are best solved classically with a classical computer.”
To allow users to harness the strengths of both quantum and classical computing technologies, IBM is working on a range of middleware and software tools that enable better communication between these different types of computing systems.
“We use the example of circuit knitting a lot when explaining this idea,” Dial said. “Our goal here is to take a single quantum circuit that is too large to run on a single quantum processor and break it up into smaller pieces that can be run on multiple processors. If all we have is classical communication between processors, we can do this, but the overhead (number of extra times we need to run the circuit) is large. If we expand that classical communication to include real-time classical communication (the ability to measure a qubit on one processor, turn it into classical data, move it to another processor, and change what we do on that second quantum processor all within a few microseconds), new advanced knitting options become possible. This richer communication allows better scaling, but now the computers need to be close enough to make this high-speed communication possible—distances of meters, not miles.”
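The overhead Dial mentions compounds multiplicatively: each cut multiplies the number of circuit executions needed, so total cost grows exponentially in the number of cuts. The sketch below makes that scaling concrete; the per-cut factors are illustrative assumptions, not figures from IBM, since the exact constant depends on the cutting scheme and on whether real-time classical communication is available.

```python
def shots_with_cuts(base_shots, overhead_per_cut, num_cuts):
    """Total circuit executions when every cut multiplies the
    sampling overhead. overhead_per_cut is an illustrative
    assumption, not a figure from IBM."""
    return base_shots * overhead_per_cut ** num_cuts

base = 1000
# Hypothetical per-cut factors: a large factor when only
# post-processed classical communication is available...
print(shots_with_cuts(base, 9, 4))   # 6561000
# ...versus a smaller one when richer real-time communication
# between nearby processors reduces the overhead.
print(shots_with_cuts(base, 4, 4))   # 256000
```

Even modestly shrinking the per-cut factor changes the total by orders of magnitude once several cuts are involved, which is why microsecond-scale communication between co-located processors matters so much.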
Dial and his colleagues are now working on a new technology known as l-couplers, set to be unveiled by 2024, which could make the overhead vanish entirely. l-couplers are microwave links between quantum processors that can be cooled down to the devices’ milli-kelvin temperatures, so they are literally frozen into the system when the processor is cooled down.
“The final, very long-term project we’re working on in this area is called transduction: moving quantum information with optical photons instead of microwaves,” Dial added. “This would allow us to make reconfigurable quantum networks, but it’s a much more difficult technology to master. Nobody has fully demonstrated this in our systems.”
Other Advances and Future Outlooks
At the IBM Quantum Summit 2022, IBM also unveiled the Quantum System Two update, a platform that supports the operation of larger processors and the diverse types of communication that would characterize a quantum-centric supercomputer. Combined with its new processor and other tools, this platform paves the way for yet another year of exciting quantum technology advancements.
“There are things we are continually working to improve: our qubit coherence times, our gate fidelities, the density and crosstalk of our devices,” Dial said. “For the next year or two, we will also focus on two big hardware-centric projects. One involves various types of communication between quantum processors: real-time classical, chip-to-chip quantum gates (quantum multi-chip-modules), and long-range quantum communication—the basic ingredients for the quantum-centric supercomputer. The other is the introduction of cryo-CMOS control to our production systems.”
Currently, IBM’s control hardware is based on field-programmable gate arrays (FPGAs), which increases its cost and limits attainable qubit densities. The team hopes that moving to CMOS-based control components integrated into the refrigerator will simplify wiring and signal delivery problems in quantum computers, bringing them closer to their goal of developing a system with a few thousand qubits.
“As we talk about tens of thousands of qubits, error correction becomes more important,” Dial noted. “We believe we can get more efficient error-correcting codes, but this will require more complicated connections between our qubits than those we have today. Right now, our heavy-hex devices (and most devices people make) have 2D arrays of qubits. Each qubit is connected to other nearby qubits on the surface of the chip in some repeating pattern. We are beginning investigations into creating connections between distant qubits on the chip and crossovers between those connections, which could pave the way toward machines that can implement efficient fault-tolerant codes.”