By chasing errors, quantum computing is approaching a major turning point

Three research teams have proposed different methods to solve the problem of error correction in quantum computing, an essential step before this technology can become widely usable.

Quantum computing is one of those technologies that seems set to revolutionize our daily lives. But it still suffers from major problems that hinder the development of concrete applications, starting with the number of errors that can slip into qubits. Three remarkable advances published on the same day could finally rid scientists of this constraint.

Very briefly, the power of this technology comes mainly from the fact that it makes it possible to store complex information in the form of subtle differences in the state of matter. The problem is that ensuring the integrity of this information requires perfectly protecting the units used to store it – in this case, quantum bits, or qubits.

This is a process that we already master very well in traditional computing: there are many mechanisms, and even dedicated error-correcting (ECC) RAM, designed to prevent and fix these errors. But qubits are much more fragile than the bits used by current equipment, and it is therefore exponentially more complicated to preserve them perfectly.

This stability problem continues to hinder the development of quantum computing, and it is naturally one of the aspects of the discipline on which researchers are working hard. Three teams have just simultaneously published quite impressive results in this area.

Up to 99.95% accuracy for a single qubit

The first comes from the team of Seigo Tarucha, a Japanese researcher whose group achieved 99.84% accuracy for one qubit, and 99.51% for a system composed of two qubits. In the Netherlands, Lieven Vandersypen’s team obtained comparable results, with 99.87% accuracy for one qubit and 99.65% for two. But the most conclusive work is certainly that of the team of Andrea Morello, a researcher at the University of New South Wales. He and his colleagues achieved 99.95% accuracy for a single qubit.

Tests on a single qubit evaluate the individual stability of a memory unit, which is absolutely fundamental to the operation of the machine. Two-qubit tests assess the stability of the logic gates that serve as the building blocks of complex computers. In any case, these are significant advances, because we are now entering a regime where errors become rare enough to be detected and corrected individually – an essential guarantee for use in concrete operations.
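To see why those fractions of a percent matter so much, here is a back-of-the-envelope sketch (illustrative arithmetic, not taken from the papers): if each gate succeeds with a given fidelity and errors are assumed independent, the chance that a whole circuit runs cleanly shrinks exponentially with the number of gates.

```python
# Illustrative arithmetic (an assumption for this sketch, not from the studies):
# treat gate errors as independent, so a circuit of n gates succeeds with
# probability fidelity ** n.
def circuit_success_probability(fidelity: float, n_gates: int) -> float:
    """Probability that n_gates consecutive gates all run without error."""
    return fidelity ** n_gates

# For a modest 1,000-gate circuit, 99% per-gate fidelity leaves essentially
# no chance of an error-free run, while 99.95% keeps a majority of runs clean.
for f in (0.99, 0.995, 0.9995):
    p = circuit_success_probability(f, 1000)
    print(f"fidelity {f:.4%} -> success probability {p:.4f}")
```

This simplified model ignores that real error correction can tolerate some errors, but it shows why crossing the ~99% threshold per qubit is the point where correcting the remaining errors becomes realistic.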

“When errors become so rare, it becomes possible to detect them and correct them when they occur,” says Andrea Morello. “This shows that it is possible to build quantum computers with sufficient power to perform meaningful operations.” “This paves the way for sets of qubits capable of producing robust and practically useful calculations,” adds his colleague Serwan Asaad.

Obviously, replicating this performance at the scale of an entire quantum computer will be far from easy. But researchers now have three strong studies to build on. We could therefore soon see the first prototypes of (almost) error-free quantum computers appear, with all the consequences that this implies for our civilization.

The text of the study is available here.
