Google today announced a demonstration of quantum error correction on the next generation of its Sycamore quantum processor. The iteration on Sycamore isn't dramatic: it has the same number of qubits, just with better performance. And quantum error correction isn't really new, either; the company got it working a few years ago.
Instead, the signs of progress are a little more subtle. On previous generations of processors, the qubits were so error-prone that adding more of them to an error-correction scheme created more new errors than the added corrections could fix. On this new iteration, adding more qubits to the scheme actually lowers the error rate.
We can fix this
The functional unit of a quantum processor is a qubit, which is anything — an atom, an electron, a chunk of superconducting electronics — that can be used to store and manipulate a quantum state. The more qubits you have, the more powerful the machine. If you have access to several hundred, it is believed that you can perform calculations that would be difficult, if not impossible, on traditional computer hardware.
That is, assuming all the qubits behave correctly. Which they generally don't. The more qubits you throw at a problem, the more likely you are to hit an error before the calculation can complete. We now have quantum computers with more than 400 qubits, but a calculation that actually required all 400 of them would almost certainly fail before it finished.
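To see why, it helps to put rough numbers on it. The figures below are illustrative assumptions, not measurements from Google's hardware, but the arithmetic is the point: the chance that nothing goes wrong shrinks exponentially with both qubit count and circuit depth.

```python
# Back-of-the-envelope sketch: each qubit survives one step of the
# computation with probability (1 - error_rate), and every qubit must
# survive every step for the final answer to be trustworthy.
def success_probability(num_qubits: int, depth: int, error_rate: float) -> float:
    return (1.0 - error_rate) ** (num_qubits * depth)

# Even an optimistic 0.1% per-qubit, per-step error rate dooms a
# 400-qubit computation of modest depth.
print(success_probability(num_qubits=400, depth=100, error_rate=0.001))
# ~4e-18, i.e., essentially never succeeds
```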
Creating an error-corrected logical qubit is widely accepted as the solution to this problem. Making one involves distributing a single quantum state across a collection of connected hardware qubits. (In terms of computational logic, all of these hardware qubits can be addressed as a single unit, hence "logical qubit.") Error correction is enabled by additional qubits placed adjacent to each member of the logical qubit. Measuring these neighbors reveals whether errors have crept into the hardware qubits that make up the logical qubit, without reading out (and thereby destroying) the stored quantum state itself.
Now, if one of the hardware qubits that is part of the logical qubit has an error, the fact that it contains only a fraction of the information of the logical qubit means that the quantum state is not destroyed. And measuring its neighbors will reveal the flaw and allow for a little bit of quantum manipulation to fix it.
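The simplest textbook version of this idea is the three-qubit repetition code, sketched below as a purely classical simulation. Real hardware protects superpositions and uses much larger surface codes, so the function names and classical bits here are illustrative stand-ins, not Google's actual scheme.

```python
import random

# One logical bit stored redundantly across three "data" qubits.
def encode(logical_bit: int) -> list[int]:
    return [logical_bit, logical_bit, logical_bit]

# A noisy channel: each data qubit flips independently with probability p.
def apply_noise(data: list[int], p: float) -> list[int]:
    return [bit ^ 1 if random.random() < p else bit for bit in data]

# Syndrome measurement: parity checks between neighboring data qubits.
# Crucially, this reveals only *where they disagree*, not the stored value.
def measure_syndrome(data: list[int]) -> tuple[int, int]:
    return (data[0] ^ data[1], data[1] ^ data[2])

# Decode the syndrome and flip the single qubit it points at.
def correct(data: list[int], syndrome: tuple[int, int]) -> list[int]:
    fixes = {(1, 0): 0, (1, 1): 1, (0, 1): 2}
    if syndrome in fixes:
        data = data.copy()
        data[fixes[syndrome]] ^= 1
    return data

data = apply_noise(encode(1), p=0.05)
data = correct(data, measure_syndrome(data))
print(data)  # recovers [1, 1, 1] unless two or more qubits flipped at once
```

The point the sketch captures is that the parity checks only say which qubits disagree; the decoder can then flip the offending qubit back without ever looking at the logical value being stored.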
The more hardware qubits you allocate to a logical qubit, the more robust it should be. There are currently two problems with doing that. One is that we don't have many hardware qubits to spare: running a robust error-correction scheme on the processors with the highest qubit counts would leave fewer than 10 error-corrected qubits available for a computation. The second is that the error rates of today's hardware qubits are too high for any of this to work. Adding existing qubits to a logical qubit doesn't make it more robust; it makes it more likely that so many errors will occur at once that they cannot be corrected.
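A toy model makes that threshold behavior concrete. In the sketch below (again, illustrative assumptions rather than Google's data), a distance-d code fails only when more than half of its d qubits go wrong. When the per-qubit error rate is low, growing the code drives the logical error rate down rapidly; when it is too high, growing the code makes things worse. This toy model's break-even point is 50%; for real surface codes running full syndrome-measurement circuits, the threshold is commonly estimated at roughly 1%, which is why hardware error rates matter so much.

```python
from math import comb

# Toy model: a distance-d code fails when more than half of its d qubits
# suffer independent errors, each with probability p.
def logical_error_rate(d: int, p: float) -> float:
    return sum(comb(d, k) * p**k * (1 - p)**(d - k)
               for k in range(d // 2 + 1, d + 1))

for p in (0.01, 0.60):
    print(p, [f"{logical_error_rate(d, p):.1e}" for d in (3, 5, 7)])
# p = 0.01: 3.0e-04 -> 9.9e-06 -> 3.4e-07  (bigger code, better logical qubit)
# p = 0.60: 6.5e-01 -> 6.8e-01 -> 7.1e-01  (bigger code, worse logical qubit)
```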