Quantum Tech Insider

After Heron: What IBM Quantum System Two Has to Prove Next

by Quantum Tech Insider Team

IBM's Heron processor did something important for the quantum sector: it made IBM's roadmap look less like marketing and more like an engineering program. When IBM and RIKEN brought a 156-qubit Heron-powered Quantum System Two online in Japan in June 2025, connected to the Fugaku supercomputer, the point was not just qubit count. It was proof that IBM wants the commercial future of quantum computing to be hybrid, modular, and tightly tied to high-performance computing.

So what comes after Heron? The short answer is that IBM now has to prove Quantum System Two can scale beyond a single excellent chip into a useful computing architecture.

If you want a clean primer before going deeper, Quantum Computing for Everyone is still one of the better plain-English starting points.

Why Heron mattered more than Condor

For a while, IBM's roadmap looked dominated by giant qubit-number milestones. Condor broke the 1,000-qubit barrier in late 2023, but it was Heron that reset the conversation. IBM's own positioning has emphasized quality, speed, and reduced cross-talk rather than raw scale. At the RIKEN launch, IBM said the latest Heron could reach a 100-qubit layered circuit error rate around 3×10^-3 and CLOPS around 250,000, both major improvements over Eagle-era hardware.
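To see why that ~3×10^-3 layered error rate is the headline figure rather than qubit count, a back-of-envelope estimate helps. This sketch is our illustration, not IBM's benchmarking methodology: it simply assumes errors compound independently per circuit layer, so a depth-d circuit retains roughly (1 - ε)^d of its signal.

```python
# Back-of-envelope: how the per-layer error rate limits usable circuit depth.
# Simplifying assumption: errors compound independently across layers.

def surviving_signal(error_per_layer: float, depth: int) -> float:
    """Rough fraction of the ideal signal left after `depth` layers."""
    return (1 - error_per_layer) ** depth

heron_eps = 3e-3  # Heron-class layered error rate cited at the RIKEN launch

for depth in (10, 100, 500, 1000):
    print(f"depth {depth:4d}: signal ~ {surviving_signal(heron_eps, depth):.2f}")
```

At depth 100 roughly three-quarters of the signal survives; by depth 1,000 almost nothing does. That is why halving the layered error rate is worth far more than doubling the qubit count.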

That matters because Quantum System Two is not supposed to be a museum case for one monster chip. It is supposed to be a data-center-style platform with modular cryogenics, classical runtime servers, and room to link multiple QPUs. In other words, Heron was the processor that made IBM's systems story believable.

For readers who want a slightly more technical but still accessible follow-up, Dancing with Qubits is a worthwhile companion.

The real next step is Nighthawk, not just "more qubits"

IBM's current public roadmap says the 2026 target is a Nighthawk processor configuration using up to three 120-qubit modules, for a combined 360 qubits, with the ability to run roughly 7,500 gates. That is the first number investors and researchers should watch after Heron.

Why? Because 360 mediocre qubits would not change much. But 360 qubits in a System Two environment, with better orchestration and tighter HPC coupling, could support more serious chemistry and materials workflows than today's isolated benchmark circuits. IBM has also been explicit that it wants to show near-term quantum advantage by the end of 2026, not in the old "supremacy" sense of a stunt calculation, but in scientific workflows where quantum processors handle the hardest subproblems inside a larger classical pipeline.

That is a much tougher claim. It means IBM must show the whole stack working: compiler, runtime, error mitigation, job scheduling, and data movement between QPU and supercomputer.
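The workflow pattern IBM is describing can be sketched in a few lines. This is a minimal illustration of the "quantum kernel inside a classical pipeline" idea only; every function name here is a hypothetical placeholder, not an IBM Quantum Platform API.

```python
# Sketch of a quantum-centric supercomputing loop: classical code prepares a
# subproblem, a QPU handles the hard kernel, and classical HPC refines the
# result. All callables are placeholders supplied by the user.

from typing import Callable

def hybrid_workflow(
    classical_prepare: Callable[[], dict],
    quantum_solve: Callable[[dict], dict],
    classical_postprocess: Callable[[dict], float],
    max_rounds: int = 5,
    tolerance: float = 1e-3,
) -> float:
    """Iterate classical setup -> quantum kernel -> classical refinement."""
    best = float("inf")
    for _ in range(max_rounds):
        subproblem = classical_prepare()       # e.g. build an active-space model
        raw = quantum_solve(subproblem)        # QPU tackles the hardest piece
        value = classical_postprocess(raw)     # HPC side mitigates and refines
        if abs(best - value) < tolerance:      # converged across rounds
            break
        best = min(best, value)
    return best
```

Every box in IBM's reference architecture (compiler, runtime, error mitigation, scheduling, data movement) lives inside one of those three callables, which is why demonstrating the loop end to end is harder than demonstrating any single chip.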

System Two is becoming an HPC product

This is the biggest shift that casual observers miss. The future of IBM Quantum is no longer "log into the cloud and run a circuit." IBM is turning Quantum System Two into infrastructure for national labs, pharma partnerships, and supercomputing centers.

IBM's March 12, 2026 reference architecture release made that explicit. IBM described quantum-centric supercomputing as coordinated workflows spanning CPUs, GPUs, storage, networking, and QPUs. That sounds abstract, but it has a practical implication: the winner may be the company that integrates quantum hardware into existing scientific computing environments first, not the company that posts the flashiest qubit number.

That is also why the RIKEN-Fugaku installation matters. If IBM can repeatedly show real chemistry or materials simulations where the quantum processor improves accuracy or cost inside an HPC workflow, System Two starts to look like a new class of research appliance rather than an experimental cloud endpoint.

The fault-tolerance milestone to watch

After Nighthawk, IBM's roadmap points to Kookaburra, described as a fault-tolerant module containing a logical processing unit and quantum memory. That is the milestone that separates "useful noisy systems" from the first genuinely scalable fault-tolerant architecture.

IBM's longer-range target remains Starling in 2029: about 200 logical qubits capable of running 100 million quantum gates. That number is far more meaningful than any headline about thousands of physical qubits, because logical qubits are what matter for running long, reliable algorithms.
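The arithmetic behind that claim is straightforward, and worth making explicit. In a rough independent-error model, running N gates with a decent chance of an error-free execution requires a per-gate logical error rate on the order of 1/N. The numbers below are our illustration of that scaling, not IBM's published error budget.

```python
# Why 100 million gates implies ultra-low logical error rates.
# Rough model: P(no logical error) ~ (1 - p)^N ~ exp(-p * N) for small p.

import math

def success_probability(logical_error_rate: float, gates: int) -> float:
    """Approximate chance a run of `gates` operations sees no logical error."""
    return math.exp(-logical_error_rate * gates)

gates = 100_000_000  # Starling-class gate count

# Today's best physical-qubit error rates are hopeless at this depth:
print(success_probability(1e-3, gates))   # effectively zero
# An error-corrected logical rate near 1/N keeps success plausible:
print(success_probability(1e-9, gates))   # roughly 0.9
```

A 10^-3 physical error rate fails essentially every run at this depth, while a 10^-9 logical rate succeeds about nine times in ten. Closing that six-order-of-magnitude gap is the entire job of error correction, and it is why logical-qubit counts, not physical ones, are the numbers to watch.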

If that sounds like a different language from today's hardware race, it is. Physical qubits win headlines. Logical qubits win commercial relevance.

Readers interested in the fault-tolerance side specifically may want to dig into the quantum error correction literature, because that field is increasingly where the real competitive moat is being built.

What comes next for IBM, realistically

The honest answer is that IBM does not need to "win quantum" in 2026. It needs to make Quantum System Two routine enough that researchers stop treating it as a special event. That means more Heron-class or better systems, more co-located HPC deployments, cleaner software access through IBM Quantum Platform, and one or two application papers that serious scientists cannot dismiss as toy problems.

Heron was the credibility checkpoint. The next phase is operational proof: can IBM turn Quantum System Two into the standard chassis for quantum-centric supercomputing before fault tolerance fully arrives?

That is the real post-Heron question, and it is a better one than asking who has the biggest chip.