Quantum Untangled - The Changing Face of 'Advantage'
In-conference excitement and jitters at Commercialising Quantum
Last week I attended the second annual Commercialising Quantum conference in London. It was an opportunity for companies and organisations working in this space to examine the path to commercialisation, and the ways in which money is already being made from quantum systems. As you might imagine, there were also more than a handful of sales reps on site.
The event included a main hall, a networking hub, a second conference room for smaller sessions and a series of private rooms where companies like AWS and IBM would hold invite-only panels. The networking area was surrounded by tables full of pastries and cereals. Some of the biggest startup players in the space had stands, including IonQ, which recently made its full-stack Aria system available on Amazon Braket. Quantinuum also drew a large crowd, likely attracted by the recent discovery of the first topological qubits.
This was the first quantum conference I’ve been to in about a year and the first purely focused on commercialisation. Like much of modern-day quantum technology, it felt like a cross between an elongated sales pitch and a series of academic lectures. One of the things I like to do at these get-togethers is look for messages and ideas that permeate the event. This year, that message was very simple indeed: the definition of quantum advantage has changed.
With the company fresh from its topological qubit breakthrough, the Quantinuum stall was especially busy. (Photo courtesy of Commercialising Quantum)
Time was that ‘advantage’ referred to the moment in the field’s glistening future when quantum machines would have amassed hundreds of thousands of error-free qubits, thereby asserting an indisputable authority over their comparatively decrepit classical cousins. Those ambitions have now been tempered. The conclusion among the experts I spoke to at Commercialising Quantum was that advantage doesn’t need to be equated with thousands of qubits, but just a handful – enough, in short, to perform a given task at a pace that’s faster and more efficient than a computer running on simple bits and bytes. Microsoft calls this ‘practical quantum advantage.’
‘Okay,’ I hear you shout, ‘so how long before we get to that point?’ Even as recently as three years ago, the answer from most quantum researchers referred to decades-long timescales. Now, the spiel from big tech firms and their quantum sales reps is that ‘commercial advantage’ is just five or 10 years away. That might still sound like a while to most of you, but for the corporate exec diligently planning their future quarters, that’s a rapidly approaching deadline.
“You can't be a large organisation and not have already readied yourself,” says Kristin M. Gilkes, EY Global Innovation’s Quantum Leader. If major companies are serious about quantum, after all, it means being serious about creating new algorithms that run on quantum hardware, weaving quantum encryption into their communications, and training their workforce to anticipate their new, quantum-powered realities. “It isn’t too late” to bone up on this kind of stuff, argues Gilkes, “but they should call me quickly.”
Quantum Kool-Aid, Quantum Winter
That sense of urgency is complemented by another observation that I heard repeatedly that day, whether during pre-interview chit-chat or while waiting for my fully plant-based lunch: that fewer qubits are needed to solve real-world problems than we all first thought. When you consider that we now have a greater understanding of how quantum algorithms work than ever before, and factor in the (gleefully optimistic) view among hardware developers that we will have usable, logical qubits within five years, the concept of commercial advantage for enterprise comes into stark relief.
During his talk, IBM’s Jay Gambetta shared a slide extolling the need for a ‘no-nonsense’ approach to quantum computing and the iterative path to get to that point (Image: Ryan Morrison)
That is, of course, if you believe every one of those factors will pan out in exactly the way that quantum researchers and sales reps hope they will. Deborah Berebichez isn’t so sure. “I do fear the consequences of overhype in the media,” the founder of deep tech consultancy Solve For You told delegates.
Things got more pessimistic as the day wore on. Some delegates wondered aloud if quantum computing, once the darling of the tech VC landscape, was now losing too much ground in the attention wars to generative AI – whether, in fact, investors see ChatGPT-related technologies as a better way of getting a return on their investment than a fragile computing technology with few existing practical applications.
IBM says it will have a 100,000-qubit quantum supercomputer by 2033 (Image: IBM)
This feeling, said the director of the UK’s National Quantum Computing Centre, Michael Cuthbert, is being fuelled by ructions in the global economy. The supply chain shocks from the pandemic and Russia’s invasion of Ukraine are still reverberating, and the cost of borrowing has only gone up and up over the past year and a half. All these factors, said Cuthbert, mean that as far as the main investment challenges for quantum are concerned, “the issues are in securing follow-on funding beyond the first round.”
Put a quantum spring in my step, why don’t you
Is a quantum winter in the offing, then? Not necessarily, said Cuthbert (cue audible sighs – fictional ones, Tech Monitor is legally keen to point out – from the quantum sales reps seated in the audience), but the sector does “need to justify our investment efforts more rigorously than before.”
There was also a general sense that quantum isn’t representing itself very well – too much focus by industry advocates on the inner workings of a quantum computer, and not enough chatter on how such machines could be put to good use, or even how they’re being used already. “There is a risk that, when talking to the public, [quantum computing] becomes science fiction,” said Cuthbert. “Work needs to be done to bridge the gap and explain all the benefits from tangible use cases.”
One suggestion was the need for an independent metric created by a trusted government group or consortium like NIST, which is already managing the creation of post-quantum cryptography standards. “We have a programme to look at standards and comparisons across different quantum modalities,” said Cuthbert. “We, with other national institutes, can be a trusted voice to make cross comparisons.”
Where does that leave quantum computing’s commercial future? Is this the best it’s going to get for the decade – or is a field that’s spent decades developing quietly in the background, one research breakthrough at a time, a little too prone to self-criticism?
I can’t answer those questions yet, but in my conversations with quantum software, hardware and cryptography companies throughout that day, I got no sense that progress toward practical quantum computing was slowing down. Peering down on delegates from large screens in the main hall, industry icon Hermann Hauser enthused at the level of investment from both the public and private sectors, while shortly after, from a small stage in the same hall, IBM’s Jay Gambetta extolled the benefits of the company’s roadmap to a 100,000-qubit machine by 2033. Conversations hummed about ‘interim solutions’ and ways to use even the noisiest of QPUs to gain a commercial advantage. It didn’t feel like any winter I’ve encountered. It felt like spring.
Thanks for the write-up!
IMO (and in no way representing my employer!), one of the primary drivers for the shift in the field's conception of advantage is the rise of error mitigation. Instead of trying to do error correction -- and incur substantial overheads to distill logical qubits -- error mitigation allows us to "virtualize" the effect of the noise, by simulating a less-noisy quantum computer through the use of an ensemble of circuits and classical post-processing. In that sense, we trade off needing more and higher-quality qubits (correction) for needing to run more circuits (mitigation).
Back when I was in grad school, error mitigation wasn't a thing. (Mike and Ike doesn't talk about it.) So when I was learning about quantum then, the focus was definitely on fault-tolerant algorithms, etc. And 'advantage' definitely had some flavor of "wait for fault-tolerance".
But with the advent of error mitigation, the calculus changes. Although a full and complete theory of error mitigation doesn't yet exist, and the required overheads (size of the ensemble of circuits) can be high, the technique does give an alternative path to making quantum computers useful.
(See this nice review on the arXiv: https://arxiv.org/abs/2210.00921)
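To make the "ensemble of circuits plus classical post-processing" idea concrete, here's a minimal sketch of zero-noise extrapolation, one common mitigation scheme. The scale factors and expectation values below are made-up numbers purely for illustration; in practice each value would come from running noise-amplified versions of the same circuit on hardware.

```python
# Minimal sketch of zero-noise extrapolation (ZNE): run the same circuit
# at several artificially amplified noise levels, then extrapolate the
# measured expectation value back to the zero-noise limit with classical
# post-processing. All numbers below are hypothetical.
import numpy as np

# Noise scale factors (1.0 = the hardware's native noise level).
scale_factors = np.array([1.0, 2.0, 3.0])

# Hypothetical expectation values of some observable, one per scale factor,
# each estimated from an ensemble of (noise-amplified) circuit runs.
noisy_expectations = np.array([0.78, 0.61, 0.47])

# Fit a low-order model (here: linear) to the noisy data and evaluate it
# at scale factor 0 to estimate the noiseless expectation value.
coeffs = np.polyfit(scale_factors, noisy_expectations, deg=1)
mitigated_value = np.polyval(coeffs, 0.0)

print(f"Raw value at native noise: {noisy_expectations[0]:.3f}")
print(f"Zero-noise extrapolated:   {mitigated_value:.3f}")
```

The overhead shows up exactly where the comment says it does: every extra scale factor (and the sampling needed to estimate each expectation value accurately) means running more circuits, rather than building better qubits.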