In our upcoming Future Human podcast, we tackle the future of quantum computing. Here, we publish the full interview with one of the guests, Kate Marshall, global technical ambassador lead for IBM Quantum.
Kate Marshall is a research scientist and engineer at IBM Quantum working on near-term quantum computing application development projects. She also leads a global team of technical quantum ambassadors who work directly with clients, academics and the general public to help them understand the industry impact of quantum computing.
“Our roles involve demystifying quantum and helping people understand why it’s important to them, and particularly support various industries in understanding the applications and use cases of quantum computing that are relevant to their domain,” says Marshall.
Why is quantum computing so important?
We start our conversation with the simple question of how she would describe quantum computing, and why it matters. “It’s always a challenge of how do you do this concisely and how do you do it in a way that’s consumable for a broad audience, but for me, quantum computing essentially describes a fundamentally different type of information processing whereby we take advantage of the fundamental principles of quantum mechanics, like superposition and entanglement.
“It’s very easy to get stuck in the weeds and focus too much on the technical details. But for now, we can think of them as the spooky quantum mechanical phenomena that present themselves, or are observable, when you create environments that are either microscopically small or extremely cold, or both.”
However, Marshall believes it’s less productive to focus on what quantum is than on how and why we will use these technologies.
“So, what makes them useful to us? Quantum computing is the opportunity for us to use these quantum mechanical principles to create problem spaces that are infinitely more complicated than any equivalent classical space we could create, and therefore to begin to solve problems that have previously been considered intractable, or impossible to solve within a reasonable amount of time on even the most powerful classical compute resources,” she says.
“When I say that they’re infinitely more complicated, what I’m really alluding to is that we can create these spaces called quantum Hilbert spaces, and those are extensions to the three-dimensional Euclidean space that we live in, that we’re so familiar with. But it’s this space extended to millions of different state dimensions that scale exponentially with the number of quantum bits, or units of information that we have on our quantum processor.
“And as such, with a sufficiently large quantum computer and with sufficient control and error correction eventually employed, we hope to be able to build a machine that one day can create such an environment where we can start to solve problems that have been previously considered to be impossible to solve with our classical compute resource,” she says.
“We do eventually begin to find that it’s not every type of classically difficult problem that is relevant to being solved using some combination of quantum and classical compute. But it does so happen that the handful of problems we hope to solve with this technology, like maintaining a sustainable relationship with our planet and building the health and wellbeing of our communities, are so important that they really justify the pursuit of this very complicated and difficult technology that is quantum computing. Look at things like more efficient battery material development and novel drug discovery and then you can really start to see the impact of the sort of use cases we believe quantum computing will one day have.”
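To make the exponential scaling Marshall describes concrete: an n-qubit register is described by a state vector with 2^n complex amplitudes, so every additional qubit doubles the size of the space a classical description has to track. The sketch below is a plain NumPy illustration of that idea, not IBM’s tooling, and its names are purely for exposition; it builds a superposition, an entangled Bell state, and then counts amplitudes as qubits are added.

```python
# Toy illustration (plain NumPy, not a quantum SDK) of the exponential scaling
# Marshall describes: an n-qubit state vector has 2**n complex amplitudes.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)            # |0>
ket1 = np.array([0, 1], dtype=complex)            # |1>
plus = (ket0 + ket1) / np.sqrt(2)                 # superposition (|0> + |1>)/sqrt(2)

def register(states):
    """Tensor single-qubit states into one multi-qubit state vector."""
    vec = states[0]
    for s in states[1:]:
        vec = np.kron(vec, s)
    return vec

# Three qubits, each in superposition: the state spreads over all 2**3 basis states.
print(len(register([plus] * 3)))                  # 8 amplitudes

# A two-qubit Bell state (|00> + |11>)/sqrt(2): entangled, so it cannot be
# factored into two independent single-qubit states.
bell = (register([ket0, ket0]) + register([ket1, ket1])) / np.sqrt(2)
print(len(bell))                                  # 4 amplitudes

# The amplitude count grows as 2**n, which is why describing large registers
# classically becomes intractable.
for n in (2, 10, 30, 50):
    print(f"{n} qubits -> {2**n:,} complex amplitudes")
```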
The three quantum eras
Marshall says it’s a very exciting time to be involved in the quantum industry. “Generally, I try to divide the timeline of where we are with quantum computing into three distinct eras. We’re currently sitting in a period that’s broadly known as the noisy intermediate-scale quantum computing, or NISQ, era, which essentially means that the devices we’re developing today are noisy, they are relatively immature, and there’s been no demonstration of what we would call quantum advantage, or some significant practical benefit over and above what classical computing could achieve today with our state-of-the-art classical resources.
“When we are able to demonstrate such an advantage, we’ll move into the second period in this timeline, which is the quantum advantage era. People often forget that when we’re able to demonstrate quantum advantage in one application area, that would by no means suggest we’ve demonstrated quantum advantage for all applications. This will be an iterative process whereby we’ll slowly accumulate demonstrations of this advantage in various application domains. So that era of quantum advantage will stretch over some period of time.
“And that’s by no means the end of the road, either. The third era is probably the most exciting of all, and that’s what we’re calling the fault-tolerant era, and that will present more technical and engineering and scientific challenges than we’re even aware of today,” Marshall explains.
“We’re only scratching the surface at this point in time, but this will be an era when we’ve understood how to build what’s known as error-corrected quantum computers where, just like in classical computing, we build up redundancies so that we’re able to be resilient to noise. That means that even if a few qubits (quantum bits) on any one device were to experience some errors, the system would still be able to return somewhat accurate results for our problem.”
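The redundancy Marshall alludes to is easiest to see in its classical form. Below is a toy simulation of the textbook three-bit repetition code, offered purely as an analogy for trading extra resources for resilience; real quantum error correction uses far more elaborate codes, and nothing here is specific to IBM’s approach. Each logical bit is stored as three copies, and a majority vote recovers it unless two or more copies flip.

```python
# Classical analogy for error correction via redundancy: a 3-bit repetition code.
# Each bit is copied three times; copies flip independently with probability p;
# a majority vote decodes the bit unless two or more copies flipped.
import random

def transmit(bit, p, copies=3):
    """Send one bit through a noisy channel using a repetition code."""
    received = [bit ^ (random.random() < p) for _ in range(copies)]
    return int(sum(received) > copies // 2)       # majority vote

def logical_error_rate(p, trials=100_000):
    """Fraction of trials in which the decoded bit is wrong."""
    return sum(transmit(0, p) for _ in range(trials)) / trials

for p in (0.10, 0.05, 0.01):
    print(f"raw flip probability {p:.2f} -> logical error rate ~{logical_error_rate(p):.4f}")
```

For small p the logical error rate falls to roughly 3p², which is the sense in which redundancy makes the encoded information more resilient than any individual bit.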
Quantum utility
I asked Kate about so-called quantum utility and if we are, as some suggest, currently in the quantum utility age.
“Quantum utility is a term that has only been coined recently, in the last 12 months or so. On the timeline that I’ve just described, there is a milestone that sits between where we are now and the beginning of this quantum advantage era. And that is this quantum utility concept. It’s basically where quantum computers are able to demonstrate, or in this case, in recent demonstrations, simulate a problem beyond the capabilities of just brute force classical computation using sufficiently large quantum computational devices. So, in this case, devices with more than 100 qubits,” she says.
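A rough back-of-envelope calculation shows why devices with more than 100 qubits sit beyond brute-force simulation: storing the full state vector of n qubits takes 2^n complex amplitudes, at 16 bytes each in double precision. The figures below are illustrative only; as the paper discussed next makes clear, carefully crafted problem-specific approximation methods can still compete on particular problems without paying this naive memory cost.

```python
# Back-of-envelope memory cost of brute-force state-vector simulation:
# 2**n amplitudes, 16 bytes per double-precision complex number.
for n in (30, 50, 100):
    bytes_needed = (2 ** n) * 16
    print(f"{n:>3} qubits -> {bytes_needed:.3e} bytes for the full state vector")
```

Even 50 qubits already requires about 18 petabytes, and 100 qubits around 2 x 10^31 bytes, far beyond any classical memory.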
“The first example of this was actually published in Nature in June last year, alongside UC Berkeley, and it was titled ‘Evidence for the utility of quantum computing before fault tolerance’. It was a momentous achievement, because it was the first demonstration where the narrative about quantum computing research really changed direction: the classical computing community has been able to compete with the results we produced with a quantum computer, but only by using carefully crafted, problem-specific classical approximation methods.
“It’s really an indication of how close we are to demonstrating quantum advantage, and where we can hopefully begin to see quantum computers serving as a scientific tool to explore a new scale of problems beyond brute-force classical simulation. Ideally, we’ll be hoping to see some demonstration of that advantage in the next few years. No one really knows exactly when, but the idea is that those who are able to harness this era of quantum utility will also be among the first to achieve real quantum advantage as well.”
Use cases and impacts
According to Marshall, the potential impact of quantum computing is wide and far reaching. “I speak to so many different clients in various different industries, and it’s very rare you come across an industry or a client where you say, ‘actually quantum computing has no application here’. The problems that are relevant to quantum computing generally sit in three broad categories of problems, and those are optimisation problems, simulation problems and machine learning problems. But these spread across multiple different industries, from financial services to material science, energy and utility sectors, and as far as healthcare and life sciences and even retail.”
IBM is exploring these application areas through working groups in which it partners directly with clients to investigate the far-reaching applications of this technology.
“Some of these groups include healthcare and life sciences application-led groups. This involves organisations like the Cleveland Clinic and Moderna, where we’re exploring applications of quantum chemistry and quantum machine learning to tackle various molecular discovery and drug discovery applications, and even using quantum machine learning-informed pipelines to anticipate the risk to different patients following surgery,” says Marshall.
“We also have a group that’s focusing on material science applications, which is being spearheaded by a team at Bosch and the University of Chicago, together with organisations like ExxonMobil, RIKEN and Oak Ridge National Laboratory, and they’re aiming to improve the best methods we have to build workflows for material simulation.
“And then on the other side, we also have groups like the optimisation group, where we have a far-reaching collaboration across various global institutions like E.ON and Wells Fargo, among others, to explore key questions that progress the identification of various optimisation problems where we’re seeking a quantum advantage in sustainability, finance and far beyond that as well.”
Climate and sustainability are key concerns for many IBM clients, says Marshall. “We have worked in the past with various different companies on the development of efficient battery materials. That involves low-level chemical simulation experiments trying to understand, for example in the case of lithium composite batteries, the interaction between lithium and oxygen, because lithium oxide is a common compound used in battery materials. Understanding the early-level interactions between those two molecules is really important should you wish to build efficient battery materials that, if used in a car, could travel longer distances and take less time to charge, for example.
“We also work with ExxonMobil on the development of catalytic materials that are able to operate at lower temperatures and therefore are less energy intensive in that way. So, there are active areas of application research in the climate sustainability area, but there’s also definitely room for more.”
Preparing a quantum workforce
While uncertainty remains around the timeline for quantum advantage, there is always the danger that the industry will neglect to develop the necessary skills for a future quantum environment. I asked Marshall if she felt we had quantum-ready skills in place.
“I think right now, there’s a gap between where we are and where we need to be to be able to take full advantage of this technology, but that’s an opportunity, not something to be scared of,” she says.
“For sure, the workforce needs to be fostered and grown, and the tools and the software that need to enable this workforce…are still in their infancy. They’re maturing every day, but we’re not exactly where we need to be eventually.
“But it’s exactly the right time to start acting. It’s the right time for companies to start upskilling their workforce. It’s the right time to start identifying application areas where they might see a return on investment in the future, and strategic advantages that could position them to disrupt various industries. And it’s the right time to get your hands dirty with the available software and hardware tools that are being released and developed year on year.”
Marshall says this will be a really important time to prepare for an era of quantum advantage, when we’re able to deliver practical benefit over and above what any classical computer can deliver right now.
“In short, if it takes, let’s say, five years to demonstrate that advantage for a classically intractable problem, and it also takes at least five years to upskill a workforce to make the most of this technology, then we really need to make moves to start preparing our workforce today…or even yesterday.”
Find out how emerging tech trends are transforming tomorrow with our new podcast, Future Human: The Series. Listen now on Spotify, on Apple or wherever you get your podcasts.