Google makes a breakthrough in quantum computing: the 30-year-old problem of quantum error correction is solved

Google has made a breakthrough in the field of quantum computing. Thanks to a new, improved processor and a better error correction scheme, researchers were able to significantly extend the lifetime of a logical qubit. As Ars Technica reports, Google's scientists built the Willow quantum processor, which for the first time crossed the quantum error correction threshold: as more qubits are added to the error-correcting code, the error rate decreases rather than increases. Moreover, on the fully operational 105-qubit processor, a logical qubit was found to remain stable for about an hour on average.


To advance its quantum technology, Google built its own fabrication facility for superconducting processors. “Previously, all Sycamore devices were made in a shared laboratory at the university, where graduate students and other researchers worked nearby on various experiments,” says Julian Kelly, a spokesman for the Google team. “However, we invested heavily in creating the new facility, hiring staff, equipping it, and moving our processes there.”

The new facility's first result was the Willow processor, the second generation of Google's quantum chips, with the qubit count raised to 105. Its new architecture reduced the error rate by increasing the physical size of individual qubits, making them less sensitive to noise. The progress was confirmed in tests using Google's proprietary benchmark. “We found that completing a task on our new processor takes less than five minutes, whereas a classical computer would need a time commensurate with the age of the universe,” Kelly said. Specifically, Willow solved a problem from the random circuit sampling (RCS) benchmark in under five minutes, a computation that would have taken Frontier, the fastest supercomputer in the world, 10 septillion (10^25) years.

A key aspect of the research was the behavior of logical qubits, the basic building block of quantum computing. Each logical qubit consists of multiple hardware qubits that work together to detect and correct errors. Running complex algorithms that take hours to complete requires these qubits to be stable, and Google's new result confirms that improved error correction can deliver the required level of reliability. Quantum error correction is a challenge that has confronted researchers for the past 30 years and has hampered the practical use of quantum computers.

For this, a special error correction scheme known as the surface code was used (the correction procedure itself must also tolerate errors), whose qubits must map onto a square grid of physical qubits. Enlarging that grid — increasing the code's "distance" — improves the correction. The study found that increasing the distance from three to five to seven halved the number of errors at each step. “We scale up the grid with this system, and the error rate drops by half at every step,” explained Michael Newman of Google.
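The scaling Newman describes can be sketched numerically. The snippet below is illustrative, not Google's actual data: it assumes a suppression factor of about 2 per distance step (the halving reported above), a made-up baseline error rate at d = 3, and the textbook surface-code layout in which a distance-d logical qubit uses d² data qubits plus d² − 1 measurement qubits.

```python
# Illustrative sketch of surface-code scaling: each increase of the code
# distance d by two roughly halves the logical error rate, while the
# qubit cost of one logical qubit grows as 2*d**2 - 1 in a standard
# layout (d*d data qubits plus d*d - 1 measurement qubits).

BASE_ERROR = 3e-3   # assumed logical error rate per cycle at d = 3
SUPPRESSION = 2.0   # halving per distance step (d -> d + 2), as reported

def logical_error(d, base=BASE_ERROR, factor=SUPPRESSION):
    """Estimated logical error per cycle at odd code distance d."""
    steps = (d - 3) // 2
    return base / factor**steps

def physical_qubits(d):
    """Physical qubits per logical qubit in a distance-d surface code."""
    return 2 * d * d - 1

for d in (3, 5, 7):
    print(f"d={d}: ~{physical_qubits(d)} qubits, "
          f"error/cycle ~ {logical_error(d):.1e}")
```

At d = 7 this simple layout already needs 97 physical qubits for a single logical qubit, which is why Willow's 105 qubits were just enough to demonstrate a distance-7 code.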

However, qubits are still subject to rare failures. One cause is local bursts of errors; another is a more complex phenomenon involving simultaneous errors across a region of roughly 30 qubits. Only six such events have been recorded so far, which makes them difficult to study; as Google emphasizes, “these events are so rare that it is difficult for us to collect enough statistics to analyze them.”

Beyond improving stability, increasing the size of the error correction code significantly amplifies the payoff of future hardware improvements. For example, Google calculated that a twofold improvement in hardware qubit performance at a code distance of d = 15 would reduce logical qubit errors by a factor of 250. At a distance of d = 27, the same improvement would cut errors by more than a factor of 10,000.
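These factors are consistent with the standard below-threshold scaling of the surface code, in which the logical error rate behaves roughly like (p/p_th)^((d+1)/2) for physical error rate p. Halving p then divides the logical error rate by about 2^((d+1)/2). This is a simple textbook-model check, not Google's exact calculation:

```python
# In the simple below-threshold model, the logical error rate scales as
# (p / p_th) ** ((d + 1) // 2), so halving the physical error rate p
# divides the logical error rate by 2 ** ((d + 1) // 2).

def improvement_from_halving(d):
    """Logical-error reduction from a 2x hardware improvement
    at code distance d (simple below-threshold model)."""
    return 2 ** ((d + 1) // 2)

print(improvement_from_halving(15))  # 256   (~250x, matching the article)
print(improvement_from_halving(27))  # 16384 (>10,000x, matching the article)
```

The model reproduces both quoted numbers: 2^8 = 256 at d = 15, and 2^14 = 16,384 at d = 27, which is why larger codes make every hardware gain count exponentially.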

However, errors can never be eliminated entirely. “It is important to understand that there will always be a certain level of error, but it can be reduced to a level where it becomes almost insignificant,” the company noted. Although further research is needed to extend the lifetime of logical qubits and scale up the system, the Google team is confident of achieving its goals, and the exponential improvements confirm the technology's viability.

These results pave the way toward quantum systems that are useful in practice. By the end of the decade, Google plans to build a full-fledged fault-tolerant quantum computer and begin offering quantum computing through the cloud.
