
Thanks for the write-up!

IMO (and in no way representing my employer!), one of the primary drivers for the shift in the field's conception of advantage is the rise of error mitigation. Instead of doing error correction -- and incurring the substantial overheads of distilling logical qubits -- error mitigation lets us "virtualize" the effect of the noise, simulating a less-noisy quantum computer via an ensemble of circuits plus classical post-processing. In that sense, we trade the need for more, higher-quality qubits (correction) for the need to run more circuits (mitigation).
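
To make that concrete, here's a toy sketch of zero-noise extrapolation, one of the more popular mitigation techniques (covered in the review linked below). Everything in it is made up for illustration: the "noisy circuit" is just a function that damps a known expectation value, standing in for real hardware runs with the noise deliberately amplified.

```python
import numpy as np

# Toy stand-in for a real experiment: the noiseless expectation
# value we'd like to recover.
TRUE_VALUE = 1.0

def noisy_expectation(noise_scale, rng, shots=10_000):
    """Pretend to run a circuit whose noise has been amplified by
    `noise_scale` (on hardware this is done by, e.g., gate folding).
    The toy noise model damps the signal; shot noise sits on top."""
    damped = TRUE_VALUE * np.exp(-0.3 * noise_scale)
    return damped + rng.normal(0, 1 / np.sqrt(shots))

rng = np.random.default_rng(42)

# Run the "circuit" at several artificially amplified noise levels...
scales = np.array([1.0, 2.0, 3.0])
values = np.array([noisy_expectation(s, rng) for s in scales])

# ...then fit and extrapolate back to the zero-noise limit (scale = 0).
coeffs = np.polyfit(scales, values, deg=2)
zne_estimate = np.polyval(coeffs, 0.0)

print(f"raw (scale=1): {values[0]:.4f}")   # ~0.74, badly damped
print(f"ZNE estimate:  {zne_estimate:.4f}")  # much closer to 1.0
```

Note the "ensemble of circuits" in action: one logical circuit becomes several physical runs at different noise levels, combined classically afterward.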

Back when I was in grad school, error mitigation wasn't a thing. (Mike and Ike doesn't mention it.) So when I was learning quantum computing, the focus was squarely on fault-tolerant algorithms, and "advantage" definitely had some flavor of "wait for fault tolerance."

But with the advent of error mitigation, the calculus changes. A complete theory of error mitigation doesn't yet exist, and the required overheads (the size of the circuit ensemble) can be high, but the technique does offer an alternative path to making quantum computers useful.
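
For a feel of how fast those overheads can grow: for probabilistic error cancellation, the sampling cost scales roughly like gamma^(2n) in the number of noisy gates n, where gamma > 1 depends on the noise strength. The gamma below is illustrative, not from any particular device.

```python
# Back-of-the-envelope: exponential growth of the circuit ensemble
# needed to reach a fixed statistical precision.
gamma = 1.01  # illustrative per-gate overhead factor (~1% noise)
for n_gates in (10, 100, 1000):
    print(f"{n_gates:>5} gates -> ~{gamma ** (2 * n_gates):,.0f}x more shots")
# ->    10 gates -> ~1x,  100 gates -> ~7x,  1000 gates -> ~440,000,000x
```

That blow-up at depth is exactly why mitigation is a near-term bridge rather than a replacement for fault tolerance.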

(See this nice review on the arXiv: https://arxiv.org/abs/2210.00921)
