Is quantum having its Late Triassic moment?
The field can’t agree whether we’re about to enter quantum’s golden age or a much longer transition period.
Like palaeontologists, quantum scientists like to think in eras. As far as the evolution of quantum computers is concerned, we are currently in the Noisy Intermediate-Scale Quantum (NISQ) phase. Like the dinosaurs of the Triassic, quantum machines represent the highest evolutionary state of computing – albeit one that exists at the margins of a field dominated by their slower-witted and lumbering classical counterparts.
The quantum computers of the NISQ era are also fragile, flighty machines, highly vulnerable to error and noise. The consensus among most quantum computing researchers is that this era will eventually, and very suddenly, give way to the post-NISQ epoch of error-free machines capable of harnessing thousands or even millions of logical qubits.
The reality will be messier than that. Just as the dinosaurs (presumably) did not fire a starting gun announcing the commencement of the Jurassic, so too will the post-NISQ era not magically begin overnight. It’s more logical to think that we’ll have a longer bridging period, or a “mid-stage NISQ era” as Quantinuum’s founder Ilyas Khan recently put it to me, before we begin constructing million-qubit machines: one where the first logical qubits will tentatively emerge from the quantum soup and allow scientists to make the first calculations beyond the ken of classical machines. What’s more, argues Khan, this Late Triassic is likely to emerge as soon as next year – heralded, he adds, by recent breakthroughs engineered on his firm’s 32-qubit H2 machine.
Noise problems
But first let’s go back to NISQ and what it really means. Why do we talk about quantum computers as having errors and noise in a way we don’t about GPUs or CPUs? The simple answer is that the quantum world is weird. The longer answer is that it involves spins, flips and other terms that would make a gymnast sit up and pay attention.
Qubits are made from particles, the basic building blocks of reality. They are manipulated using everything from lasers to electric fields, turning innocuous elements into data-processing machines. The problem is that these particles are extremely sensitive to their environment. Any interaction with the outside world – thermal vibrations, electromagnetic waves, even cosmic rays – can cause a quantum particle to lose its quantum state, a process known as decoherence. This phenomenon is largely responsible for the noise and errors that are characteristic of the NISQ era.
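To get a rough feel for why this matters, here is a minimal sketch – a toy model, not a description of any vendor's hardware – that treats each time step as giving a qubit some small probability of suffering a random bit-flip from environmental noise, and estimates how quickly a stored value decays. (Real decoherence also involves phase errors, which this toy deliberately ignores.)

```python
# Toy model of decoherence: a stored bit flips with probability p at each time step.
# The numbers here are illustrative assumptions, not measured hardware figures.
import random

def survival_probability(p_error_per_step: float, steps: int, trials: int = 100_000) -> float:
    """Estimate the fraction of trials in which the stored bit is still correct after `steps`."""
    correct = 0
    for _ in range(trials):
        state = 0  # the value we are trying to preserve
        for _ in range(steps):
            if random.random() < p_error_per_step:
                state ^= 1  # a noise event flips the qubit
        correct += (state == 0)
    return correct / trials

if __name__ == "__main__":
    for steps in (1, 10, 100, 1000):
        print(f"{steps:>5} steps: {survival_probability(0.001, steps):.3f} chance the state survives")
```

Even with a one-in-a-thousand error rate per step, the stored value drifts towards a coin flip over long computations – which is exactly why error correction is needed.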
This problem can be tackled through error correction, error mitigation and error detection. Collectively, these techniques allow a quantum computer to turn a collection of physical qubits into a smaller number of more reliable logical qubits. The exact number of physical qubits required per logical qubit varies depending on the type of qubit and the techniques being used. For the trapped-ion qubits used by Quantinuum, it is likely to be fewer than 100. For the superconducting qubits used by IBM, Google and others, it is expected to be roughly 1,000 to one. It will be higher still for photonics, though companies like ORCA are already seeing some advantage from noisy, low-photon-count machines in machine learning and generative AI.
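The overheads these ratios imply are easy to work out. The back-of-the-envelope sketch below uses the rough figures quoted above – they are the article's approximations, not vendor specifications.

```python
# Physical-qubit overhead implied by the rough ratios quoted in the article.
overheads = {
    "trapped-ion (e.g. Quantinuum)": 100,        # "fewer than 100" physical qubits per logical qubit
    "superconducting (e.g. IBM, Google)": 1000,  # "roughly 1,000 to one"
}

for logical_qubits in (1, 100, 1_000_000):
    for platform, ratio in overheads.items():
        print(f"{logical_qubits:>9,} logical qubits on {platform}: "
              f"~{logical_qubits * ratio:,} physical qubits")
```

At these ratios, a million-logical-qubit machine would need on the order of a hundred million to a billion physical qubits – which is why the post-NISQ era will not arrive overnight.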
What’s next for NISQ
Not all of these companies agree with Khan on how soon we'll reach the post-NISQ epoch. IBM, for example, maintains that we're already in the era of quantum utility, and suggests its first logical qubits will emerge as early as 2025. Google, too, is making significant inroads in error mitigation and quantum machine learning using its superconducting qubit machines.
As we move forward, quantum advantage will likely arrive not as a sudden leap to error-free machines with millions of qubits, but through a gradual process of intermediate developmental stages. These may include machines that are fully error-corrected but have a low logical-qubit count, or machines with a larger number of noisy physical qubits kept useful by error mitigation and partial error correction. The future of quantum computing looks promising, and the noise is gradually starting to fade.
For what it's worth, there are interesting lines of research looking at what it means to use a quantum computer that is *partially* error-corrected. I hope additional work develops along these lines in the coming years.
See, for example:
* "A framework of partial error correction for intermediate-scale quantum computers" - https://arxiv.org/abs/2306.15531
* "The battle of clean and dirty qubits in the era of partial error correction" - https://arxiv.org/abs/2205.13454
* "Error mitigation for universal gates on encoded qubits" - https://arxiv.org/abs/2103.04915