Quantum computing promises to solve problems beyond the reach of classical systems, with potential breakthroughs in pharmaceuticals, climate modeling, logistics, and financial optimization. Yet the main challenge holding back large-scale deployment lies in the instability caused by quantum noise.
These disruptions interfere with qubits, producing unreliable outputs and lost information. To address this, researchers and companies are working to make quantum computers more resilient to noise, a property known as quantum robustness.
Achieving quantum robustness requires combining multiple methods that work together to counteract the effects of noise and prevent errors from derailing calculations. These include error suppression, which manages noise before computations begin; error detection and correction, which operates in real time to stabilize the system; and error mitigation, which cleans up errors after the fact using software tools.
By developing hardware and software that support these techniques in concert, the quantum industry is taking important steps toward building fault-tolerant quantum systems capable of solving valuable problems.
Why It Matters: Without the ability to detect and correct errors, quantum computers will remain limited to experimental tasks. Making them reliable enough for industry and science requires major improvements in how errors are handled. This progress could unlock trillions in economic impact over the next decade.
- Suppression as a First Line of Defense: Error suppression reduces the likelihood of failure by improving qubit stability before computation begins. This preprocessing technique modifies control signals to limit the effects of noise. It does not increase runtime or require extra hardware, but it cannot eliminate all types of errors. As a result, suppression is best seen as the base layer of a larger strategy rather than a standalone solution.
- Detection and Correction in Real Time: These methods use groups of physical qubits to encode more reliable logical qubits. Schemes such as surface codes, repetition codes, and qLDPC codes offer different trade-offs between error-correcting power and resource use. Detection happens through stabilizer measurements, which identify errors during operations without disturbing the encoded information. While powerful, these approaches require large numbers of ancilla (helper) qubits, which adds significant resource and engineering demands.
- Postprocessing with Mitigation: Error mitigation improves result quality after computation by relying on statistical analysis and multiple circuit executions. It is especially useful when full error correction is not yet practical. Though it doesn’t prevent errors directly, it enhances output reliability. This method increases processing time and costs due to repeated circuit runs but enables usable results from less stable quantum machines.
- Integrating the Full Stack: Building quantum robustness involves aligning error-handling techniques across the entire system. Companies are now working to ensure compatibility between correction techniques and processor architectures. This includes collaborations between hardware developers and software teams to deliver coordinated solutions capable of managing errors at all levels of the stack.
- Evaluating Correction Techniques: Quantum developers compare correction methods using six criteria: modality compatibility, architectural support, qubit overhead, code distance, scalability, and recent advancements. These benchmarks help teams choose appropriate tools based on their platform and development goals, enabling a practical path toward fault tolerance.
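To make the suppression idea above concrete, here is a minimal toy simulation of one classic suppression primitive, the spin echo: a single control pulse at the midpoint of an idle period cancels slowly drifting phase noise. The noise model and its constants are illustrative assumptions, not the specific control-signal shaping vendors use in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

def dephasing_error(n_steps, echo):
    """Accumulated phase error for one qubit under quasi-static Z noise."""
    detuning = rng.normal(0.0, 0.1)  # slowly drifting frequency offset (toy value)
    phase = 0.0
    sign = 1.0
    for step in range(n_steps):
        phase += sign * detuning
        # An echo (X) pulse at the midpoint flips the sign of all later
        # phase accumulation, so quasi-static noise cancels itself out.
        if echo and step == n_steps // 2 - 1:
            sign = -1.0
    return abs(phase)

runs = 2000
plain = np.mean([dephasing_error(100, echo=False) for _ in range(runs)])
echoed = np.mean([dephasing_error(100, echo=True) for _ in range(runs)])
print(f"mean |phase error| without echo: {plain:.3f}")
print(f"mean |phase error| with echo:    {echoed:.3f}")
```

The echo costs no extra qubits and no extra shots, which is why suppression is cheap, but it only helps against noise that is slow compared to the pulse sequence, illustrating why it cannot stand alone.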
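The detection-and-correction bullet mentions repetition codes and stabilizer measurements. A purely classical sketch of the three-bit repetition code shows the core mechanism: parity checks (standing in for stabilizer measurements) reveal which bit disagrees without reading the encoded value directly. The bit-flip noise model and error rate here are illustrative.

```python
import random

random.seed(1)

def noisy_copy(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def correct(bits):
    """Two parity checks locate a single flipped bit, mimicking a syndrome."""
    s1 = bits[0] ^ bits[1]
    s2 = bits[1] ^ bits[2]
    if s1 and not s2:
        bits[0] ^= 1
    elif s1 and s2:
        bits[1] ^= 1
    elif s2:
        bits[2] ^= 1
    return bits

def logical_error_rate(p, trials=20000):
    errors = 0
    for _ in range(trials):
        encoded = [0, 0, 0]  # logical 0 spread across three physical bits
        decoded = correct(noisy_copy(encoded, p))
        errors += decoded != [0, 0, 0]
    return errors / trials

p = 0.05
print(f"physical error rate: {p}")
print(f"logical error rate:  {logical_error_rate(p):.4f}")
```

The logical error rate drops well below the physical one because two simultaneous flips are needed to fool the majority, and the overhead is visible too: three physical bits (plus, on real hardware, ancilla qubits for the parity checks) per logical bit.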
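One widely used instance of the postprocessing idea in the mitigation bullet is zero-noise extrapolation: run the same circuit at deliberately amplified noise levels, then fit the results and extrapolate back to the zero-noise limit. The exponential damping model and all constants below are assumptions for illustration, not a real device model.

```python
import numpy as np

rng = np.random.default_rng(7)

TRUE_VALUE = 1.0  # ideal expectation value of a hypothetical observable

def noisy_expectation(scale, shots=4000):
    """Toy model: noise damps the observable; finite shots add sampling noise."""
    damped = TRUE_VALUE * np.exp(-0.15 * scale)
    samples = rng.choice([1.0, -1.0], size=shots,
                         p=[(1 + damped) / 2, (1 - damped) / 2])
    return samples.mean()

# Measure at amplified noise scales (on hardware, e.g. via gate folding),
# then extrapolate the trend back to scale = 0.
scales = np.array([1.0, 2.0, 3.0])
values = np.array([noisy_expectation(s) for s in scales])
fit = np.polyfit(scales, values, deg=2)
zero_noise_estimate = np.polyval(fit, 0.0)

print(f"raw value at native noise: {values[0]:.3f}")
print(f"extrapolated estimate:     {zero_noise_estimate:.3f}")
```

Note the trade-off the bullet describes: the estimate is closer to the ideal value than any raw measurement, but it costs several times as many circuit executions, and extrapolation amplifies statistical uncertainty rather than preventing errors.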
Go Deeper -> Making fault-tolerant quantum computers a reality – McKinsey & Company
Trusted insights for technology leaders
Our readers are CIOs, CTOs, and senior IT executives who rely on The National CIO Review for smart, curated takes on the trends shaping the enterprise, from GenAI to cybersecurity and beyond.