IBM touts error mitigation for greater quantum computing performance

Error mitigation for quantum computers could ultimately lead to more reliable and useful systems, according to IBM, which recently demonstrated how its error-handling technology enabled a quantum computer to outperform a classical supercomputing approach.

Quantum computing promises to tackle large, complex problems that are impractical for classical machines, and future applications are expected to significantly advance areas such as AI and machine learning in industries including automotive, finance, and healthcare. But among the challenges developers face are the noisiness of today’s quantum systems and the errors they generate.

“Today’s quantum systems are inherently noisy and they produce a significant number of errors that hamper performance. This is due to the fragile nature of quantum bits or qubits and disturbances from their environment,” IBM stated in a release about its latest quantum developments.

IBM Quantum and the University of California, Berkeley said they have developed techniques showing that “noisy quantum computers will be able to provide value sooner than expected, all thanks to advances in IBM Quantum hardware and the development of new error mitigation methods,” the researchers wrote in a paper published this week in Nature.

“Errors are a natural thing to occur in a computer: the quantum state should evolve as prescribed by the quantum circuit that is executed. However, the actual quantum state and quantum bits might evolve differently, causing errors in the calculation, due to various unavoidable disturbances in the outside environment or in the hardware itself, disturbances which we call noise,” the researchers stated.

“But quantum bit errors are more complex than classical bit errors. Not only can the qubit’s zero or one value change, but qubits also come with a phase — kind of like a direction that they point. We need to find a way to handle both of these kinds of errors at each level of the system: by improving our control of the computational hardware itself, and by building redundancy into the hardware so that even if one or a few qubits error out, we can still retrieve an accurate value for our calculations.”
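To make that distinction concrete, here is a brief illustrative sketch, not drawn from IBM’s paper, using Qiskit’s quantum-information tools. A stray Pauli-X acts as the bit-flip error the researchers describe, while a stray Pauli-Z acts as the phase-flip error that changes the “direction” a qubit points:

```python
# Illustrative only: the two basic single-qubit error types described above,
# shown with Qiskit's quantum_info module.
from qiskit.quantum_info import Statevector, Operator
from qiskit.circuit.library import XGate, ZGate

zero = Statevector.from_label("0")   # computational-basis state |0>
plus = Statevector.from_label("+")   # superposition (|0> + |1>)/sqrt(2)

# Bit-flip error: a Pauli-X swaps the qubit's 0 and 1 values.
bit_flipped = zero.evolve(Operator(XGate()))
print(bit_flipped)                   # now |1>

# Phase-flip error: a Pauli-Z leaves the 0/1 populations alone but
# reverses the qubit's phase -- the "direction" it points.
phase_flipped = plus.evolve(Operator(ZGate()))
print(phase_flipped)                 # now (|0> - |1>)/sqrt(2)
```

A classical bit can only suffer the first kind of error; handling both kinds at once is what makes quantum error handling harder.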

Most recently, IBM researchers shared an experiment showing that, by mitigating errors, a quantum computer was able to outperform leading classical computing approaches. IBM used its 127-qubit Eagle quantum processor to generate large, entangled states that simulate the dynamics of spins in a model of a material and accurately predict its properties.
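For readers who want a feel for what such a workload looks like in code, the following is a minimal, hypothetical sketch at toy scale: a Trotterized spin-model circuit of the kind used to simulate such dynamics, submitted through the Qiskit Runtime Estimator with its built-in error mitigation (resilience) options turned on. This is not IBM’s published code; the backend name is only an example, and the Options interface reflects qiskit-ibm-runtime as documented in 2023 and may differ in newer releases.

```python
# Hedged sketch, not IBM's experiment: a toy Trotterized transverse-field
# Ising circuit run through the Qiskit Runtime Estimator with error
# mitigation enabled. Backend name and Options API are 2023-era examples.
from qiskit import QuantumCircuit
from qiskit.quantum_info import SparsePauliOp
from qiskit_ibm_runtime import QiskitRuntimeService, Session, Estimator, Options

n_qubits, n_steps, theta_zz, theta_x = 5, 4, 0.4, 0.3

qc = QuantumCircuit(n_qubits)
for _ in range(n_steps):                  # one Trotter step per loop
    for i in range(n_qubits - 1):
        qc.rzz(theta_zz, i, i + 1)        # nearest-neighbor ZZ coupling
    for i in range(n_qubits):
        qc.rx(theta_x, i)                 # transverse field on every spin

# Observable: magnetization of the middle spin.
obs = SparsePauliOp.from_sparse_list(
    [("Z", [n_qubits // 2], 1.0)], num_qubits=n_qubits
)

service = QiskitRuntimeService()          # assumes a saved IBM Quantum account
options = Options()
options.resilience_level = 2              # level 2 adds zero-noise extrapolation (per 2023 docs)

with Session(service=service, backend="ibm_sherbrooke") as session:  # example Eagle-class backend
    estimator = Estimator(session=session, options=options)
    result = estimator.run(circuits=qc, observables=obs).result()
    print(result.values)                  # error-mitigated expectation value
```

The published experiment ran circuits of this general shape at far greater width and depth, which is what made the classical cross-checks so demanding.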

A team of scientists at UC Berkeley performed the same simulations on classical supercomputers to verify the results from the IBM Quantum Eagle processor. As the scale and complexity of the model increased, the quantum computer continued to return accurate results with the help of advanced error mitigation techniques, even as the classical computing methods eventually faltered and failed to match the IBM Quantum system’s results, the researchers stated.

“This is the first time we have seen quantum computers accurately model a physical system in nature beyond leading classical approaches,” said Darío Gil, senior vice president and director of IBM Research, in a statement. “To us, this milestone is a significant step in proving that today’s quantum computers are capable, scientific tools that can be used to model problems that are extremely difficult – and perhaps impossible – for classical systems, signaling that we are now entering a new era of utility for quantum computing.”

The model of computation IBM used to explore this work is a core facet of many algorithms designed for near-term quantum devices. And the circuits themselves, at 127 qubits running 60 steps’ worth of quantum gates, are among the largest and most complex run successfully to date, the researchers stated.

“And with the confidence that our systems are beginning to provide utility beyond classical methods alone, we can begin transitioning our fleet of quantum computers into one consisting solely of processors with 127 qubits or more,” the researchers said.

As a result of this work, IBM announced that its IBM Quantum systems, both in the cloud and on-site at partner locations, will be powered by a minimum of 127 qubits, a transition to be completed over the course of the next year.

“These processors provide access to computational power large enough to surpass classical methods for certain applications and will offer improved coherence times as well as lower error rates over previous IBM quantum systems,” the researchers stated. “Such capabilities can be combined with continuously advancing error mitigation techniques to enable IBM Quantum systems to meet a new threshold for the industry, which IBM has termed ‘utility-scale,’ a point at which quantum computers could serve as scientific tools to explore a new scale of problems that classical systems may never be able to solve.”

IBM continues to make progress on the quantum roadmap it laid out last fall. Among its long-term goals are the development of a 4,000+ qubit system built with clusters of quantum processors by 2025, and the development of software that can control quantum systems and network them together while eliminating errors.

At the IBM Quantum Summit 2022, the company said it was continuing development of a modular quantum platform called System Two that will combine multiple processors into a single system and use hybrid-cloud middleware to integrate quantum and classical workflows.
