A group of researchers funded by the United States Defense Advanced Research Projects Agency (DARPA) and led by scientists at Harvard — with help from QuEra Computing, the Massachusetts Institute of Technology, Princeton, the U.S. National Institute of Standards and Technology, and the University of Maryland — claims to have created a first-of-its-kind processor that could revolutionize the field of quantum computing.
When industry insiders discuss a future where quantum computers are capable of solving problems that classical, binary computers can't, they're referring to something known as "quantum advantage."
In order to achieve this advantage, quantum computers must be stable enough to scale in size and capability. By and large, quantum computing experts believe the biggest obstacle to scalability in quantum computing systems is noise.
The Harvard team's research paper, titled "Logical quantum processor based on reconfigurable atom arrays," describes a method by which quantum computing processes can be run with error resistance and the ability to overcome noise.
Per the paper:
“These results herald the advent of early error-corrected quantum computation and chart a path toward large-scale logical processors.”
Insiders refer to the current state of quantum computing as the Noisy Intermediate-Scale Quantum (NISQ) era. This era is defined by quantum computers with fewer than 1,000 qubits (the quantum version of a computer bit) that are, by and large, "noisy."
Noisy qubits are a problem because, in this case, it means they're prone to faults and errors.
The Harvard team claims to have achieved "early error-corrected quantum computations" that overcome noise at world-first scales. Judging by their paper, however, they haven't achieved full error correction yet — at least not as most experts would likely define it.
Errors and measurements
Quantum computing is difficult because, unlike a classical computer bit, qubits essentially lose their information when they're measured. And the only way to know whether a given physical qubit has experienced an error in calculation is to measure it.
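This measurement problem can be illustrated with a toy simulation. The sketch below is purely illustrative (it is not the team's method, and the `measure` function is a hypothetical helper): a qubit in superposition holds two amplitudes, but measuring it collapses the state to a single definite value, destroying the original amplitudes.

```python
import random

def measure(amplitudes):
    """Collapse a single-qubit state: return an outcome (0 or 1)
    and the post-measurement state. The original amplitudes are lost."""
    p0 = abs(amplitudes[0]) ** 2          # Born rule: probability of outcome 0
    outcome = 0 if random.random() < p0 else 1
    collapsed = [1.0, 0.0] if outcome == 0 else [0.0, 1.0]
    return outcome, collapsed

# A qubit in equal superposition: 50/50 chance of either outcome.
state = [2 ** -0.5, 2 ** -0.5]
outcome, state = measure(state)

# After collapse, the superposition is gone -- every further
# measurement repeats the first outcome.
assert measure(state)[0] == outcome
```

The point of the sketch is the last line: once you look, the information that was in the superposition is unrecoverable, which is why checking a physical qubit for errors mid-computation is so costly.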
Full error correction would entail the development of a quantum system capable of identifying and correcting errors as they pop up during the computational process. So far, such systems have proven very hard to scale.
What the Harvard team's processor does, rather than correct errors during calculations, is add a post-processing error-detection phase wherein erroneous results are identified and discarded.
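The general detect-and-discard idea can be sketched classically. This is a loose analogy, not the team's actual protocol: each "shot" of a computation carries a simple consistency check (here, a hypothetical two-bit repetition of the result), and shots that fail the check are thrown away after the fact rather than repaired mid-run.

```python
import random

def run_shot(error_rate=0.1):
    """One noisy shot: the logical result is encoded as two copies
    that should agree; noise may flip the second copy."""
    bit = random.choice([0, 1])                               # ideal result
    copy = bit if random.random() > error_rate else 1 - bit   # noisy copy
    return bit, copy

def postselect(shots):
    """Error detection as post-processing: keep only shots whose
    two copies agree; discard (don't correct) the rest."""
    return [a for a, b in shots if a == b]

shots = [run_shot() for _ in range(1000)]
accepted = postselect(shots)
# Accepted shots passed the check; the error rate among them is lower,
# at the cost of throwing some shots away.
```

The trade-off shown here is the key design choice: detection is much cheaper than live correction, but the accepted sample shrinks as noise grows.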
This, according to the research, provides an entirely new and, perhaps, accelerated pathway for scaling quantum computers beyond the NISQ era and into the realm of quantum advantage.
While the work is promising, a DARPA press release indicated that at least an order of magnitude more than the 48 logical qubits used in the team's experiments will be needed to "solve any big problems envisioned for quantum computers."
The researchers claim the methods they've developed should be scalable to quantum systems with over 10,000 qubits.