Harvard’s Quantum Computing Revolution: A Leap Towards Error Correction and Noise Reduction
Harvard University researchers, in collaboration with QuEra Computing Inc., the University of Maryland, and the Massachusetts Institute of Technology (MIT), have made significant advances in quantum computing. With funding from the U.S. Defense Advanced Research Projects Agency (DARPA), the team has developed a unique processor designed to overcome two of the biggest challenges in the field: noise and errors.
Noise that disturbs qubits (quantum bits) and introduces computational errors has long been a major obstacle for quantum computing and a significant barrier to improving the technology. It has generally been assumed that quantum computers would need more than 1,000 qubits to perform the massive amounts of error correction required for useful computation, a hurdle that has kept these machines from being widely used.
In a groundbreaking study published in the peer-reviewed scientific journal Nature, a team led by Harvard University revealed a strategy to address these issues. The approach is built around logical qubits: groups of physical qubits linked together by quantum entanglement so that they store information collectively. Unlike traditional error correction methods that rely on redundant copies of information, this technique exploits the redundancy inherent in how a logical qubit spreads its information across many physical qubits.
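To make the idea of redundancy concrete, here is a deliberately simplified classical analogy in Python (a toy repetition code, not the quantum code the Harvard team used): one logical bit is spread across three physical bits, parity checks flag an error without reading the stored value directly, and a majority vote recovers the original bit.

```python
# Toy classical analogy (not the Harvard team's actual quantum code):
# a "logical" bit stored redundantly across three "physical" bits,
# with errors located by parity checks rather than by reading the data directly.
import random

def encode(logical_bit):
    """Spread one logical bit across three physical bits."""
    return [logical_bit] * 3

def apply_noise(physical_bits, flip_probability=0.1):
    """Randomly flip each physical bit with some probability (simulated noise)."""
    return [b ^ 1 if random.random() < flip_probability else b for b in physical_bits]

def syndromes(physical_bits):
    """Parity checks between neighbouring bits; a nonzero parity signals an error
    without revealing the encoded value itself."""
    b0, b1, b2 = physical_bits
    return (b0 ^ b1, b1 ^ b2)

def decode(physical_bits):
    """Majority vote recovers the logical bit despite a single flipped physical bit."""
    return int(sum(physical_bits) >= 2)

noisy = apply_noise(encode(1))
print("syndromes:", syndromes(noisy), "decoded logical bit:", decode(noisy))
```

A real quantum error-correcting code works on the same principle of checking consistency rather than copying data, but it must do so without directly measuring the encoded quantum state.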
The research team used 48 logical qubits, more than had ever been demonstrated before, to perform large-scale calculations on an error-corrected quantum computer. They also constructed and entangled the largest logical qubits created to date, demonstrating a code distance of 7, which indicates greater resilience to quantum errors.
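As a point of reference (standard coding-theory bookkeeping, not a figure quoted in the article), a code of distance $d$ can correct up to $t$ arbitrary single-qubit errors, so a distance-7 code tolerates three:

$$t = \left\lfloor \frac{d-1}{2} \right\rfloor, \qquad d = 7 \;\Rightarrow\; t = 3.$$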
To construct the processor, thousands of rubidium atoms were isolated in a vacuum chamber and cooled to temperatures very close to absolute zero using lasers and magnets. 280 of these atoms were turned into physical qubits and entangled with the help of additional lasers to create 48 logical qubits. Instead of communicating over wires, the qubits were moved around with optical tweezers, which brought them together to interact.
These new quantum circuits have shown a much lower error rate during calculations than previous, larger machines built from physical qubits. Rather than correcting errors as they arise mid-calculation, the Harvard team's processor includes a post-processing error detection step in which invalid outputs are identified and discarded. This offers a faster route to scaling quantum computing beyond the current Noisy Intermediate-Scale Quantum (NISQ) era.
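As a rough illustration of post-processing error detection (a hypothetical sketch of the general idea, not the team's actual analysis pipeline), the snippet below simulates many runs of a computation, keeps only the runs whose consistency check passes, and discards the rest.

```python
# Illustrative sketch of post-selection: each simulated run produces a result
# plus a check value that should agree with it in a fault-free run; runs whose
# check fails are treated as invalid and thrown away.
import random

def run_computation():
    """Stand-in for one shot of a computation: returns a result bit and a check bit."""
    result = random.randint(0, 1)
    check = result if random.random() > 0.2 else result ^ 1  # 20% chance of a detectable fault
    return result, check

shots = [run_computation() for _ in range(1000)]
accepted = [result for result, check in shots if result == check]  # discard flagged runs
print(f"kept {len(accepted)} of {len(shots)} shots after error detection")
```

The trade-off is throughput: detection-plus-discard sacrifices some runs rather than spending qubits and time correcting errors on the fly.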
These achievements open up new opportunities for quantum computing and mark a major step toward machines that are scalable, fault-tolerant, and capable of solving traditionally intractable problems. In particular, the research highlights the potential for quantum computers to tackle computations, such as large combinatorial problems, that are out of reach for techniques currently available in computer science, opening entirely new avenues for the advancement of quantum technology.