Quantum Computing—The Next Technological Transformation

We’re on the brink of entering an era of rapid and dramatic change in computing technology

By Tony Pistilli and Chris Rohde

Neural networks, the machinery behind artificial intelligence (AI), have been around for decades. As early as 1975, the design and mathematics that power modern AI were well documented; however, computers lacked the processing power to implement those designs.

In 1989, Bell Labs built what has since become a first project for many students: a neural network to identify handwritten digits. Following that breakthrough, neural network research blossomed in the early ’90s but hit a wall in the mid-’90s, again due to a lack of computing power. It was another 15 years until neural networks were widely established and became the revolutionary tool they are today, all for the simple reason that computers became fast enough to implement them. In a way, AI wasn’t the breakthrough that ushered in the data revolution of recent years; the real breakthrough was that ordinary laptops became cheap and fast enough to run AI models.

Quantum computing is a technological advancement that appears to be following a similar trajectory. Peter Shor, then at Bell Labs and now a professor at MIT, first demonstrated the power of quantum computing in 1994 when he showed that finding the prime factors of a large number (a problem for which classical computers have no known efficient algorithm and must essentially rely on guess-and-check powered by raw computing power) could be done much more efficiently on a quantum computer. It was not until 2001 that scientists from IBM and Stanford built a quantum computer that correctly factored the number 15: a monumental engineering feat, despite the problem being easily solvable by a fourth grader.

After an 18-year lull, quantum computing regained the focus of researchers and futurists with Google’s October 2019 announcement that its quantum computer had completed, in 200 seconds, a calculation that would take today’s most powerful supercomputers approximately 10,000 years to finish. This milestone marked the first time a quantum computer solved a problem that is practically unsolvable by classical computers.

As with neural networks, many predict that engineering, rather than theory, is the key barrier to advances in quantum computing, and that we are on the brink of moving past that barrier and entering an era of rapid and dramatic change in computing technology.

What Is a Quantum Computer?

Classical computers (i.e., the laptops we type on) are powered by bits, information units in the form of electrical or optical pulses that have values of either zero or one. Quantum computers use qubits instead. Qubits typically are subatomic particles like photons or electrons, and instead of being either zero or one, qubits can represent numerous possible combinations of zero and one simultaneously. This simultaneous property is called “superposition,” and it gives a single qubit the capacity to act like two bits. That’s weird, but so are most physical realities at a subatomic level.

The real power of quantum computing is the ability to “entangle” qubits with each other by linking the states of two qubits so that acting on one instantaneously affects the other, even if the qubits are very far apart. Even Einstein described this physical reality as “spooky action at a distance,” so don’t worry if you’re lost at this point.

The practical implication of this “spooky action” is that, unlike conventional computers, where bits and processing power are linearly related (doubling the number of bits roughly doubles the processing power), quantum computers grow exponentially more powerful with each qubit added: A one-qubit computer acts like two bits, a two-qubit computer acts like four bits, four qubits act like 16 bits, and so on.
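
To make that scaling concrete, here is a minimal Python sketch (using NumPy) of what a classical computer must do just to keep track of qubits. The list of amplitudes describing n qubits doubles in size with every qubit added, which is the exponential growth described above; the equal-superposition starting state is an arbitrary choice for illustration.

```python
import numpy as np

# A single qubit is described by two amplitudes, one for "0" and one
# for "1." Here, an equal superposition of both values:
qubit = np.array([1, 1]) / np.sqrt(2)

# The joint state of several qubits is the Kronecker (tensor) product
# of the individual states, so the vector doubles with every qubit.
state = np.array([1.0])
for n in range(1, 11):
    state = np.kron(state, qubit)
    print(f"{n} qubit(s) -> {state.size} amplitudes to track")
# 1 qubit(s) -> 2 amplitudes ... 10 qubit(s) -> 1024 amplitudes
```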

Qubits that are entangled are said to be in a state of “coherence,” and they can lose their coherence easily (called decoherence). Decoherence eliminates the scaling effects that make quantum computing so powerful and renders the computer’s results unusable. Decoherence occurs when the large outside world intrudes on the small, insulated world of the entangled qubits. Vibrations, temperature fluctuations, light and other environmental interference are all culprits, so quantum computers are kept in very cold and heavily managed environments to insulate them from these effects. Increasing the number of qubits and the length of time they can be held in coherence is the key barrier to the growth of quantum computing technology.

Engineering Difficulties

David DiVincenzo proposed a list of criteria in the late ’90s that quantum computers would need to meet to be useful. The criteria can be summarized into two main challenges: increasing the number of qubits in a scalable fashion and controlling the qubits.

Coherence becomes more difficult to maintain as the number of qubits increases. It is sort of like pushing multiple kids on swings at the same time: It is easy to keep two kids moving in parallel, but add even one more swing and the system becomes significantly more chaotic as you rush among the three. Pushing four or five kids in parallel would be nearly impossible. The swing system is not scalable; after a certain point (two kids), each addition is significantly more difficult. A major engineering hurdle for quantum computers is finding ways to increase the number of qubits without the relative difficulty of holding them in coherence growing beyond our ability to manage.

The second main challenge is controlling the qubits. A significant area of research in the 2010s was narrowing down the most appropriate physical phenomena out of which to build qubits. The pool of potential candidates is large: The spin or charge of an electron, the orbital level of an electron in an ionized atom and many other physical systems can act as qubits. A key goal is finding systems that are small (and therefore easy to control) and that have measurable features spanning a wide range of values (and therefore easy to measure).

Controlling qubits requires new ways of designing computing algorithms as well. Traditional computing uses three basic “gate operations” (AND, OR and NOT) to replicate any function that turns multiple inputs into a single output. (In practice, compound gates built from combinations of the three are implemented directly because they are more efficient than the basic operators themselves.) These gates are the physical building blocks of a computer: The electrical pathways on a central processing unit (CPU) run through billions of transistors that perform these gate operations.
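
As a toy illustration of that universality, here is a short Python sketch that composes the three basic gates into XOR (output 1 when exactly one input is 1); the helper names are ours, purely for demonstration.

```python
# The three basic gates as Boolean functions.
AND = lambda a, b: a and b
OR = lambda a, b: a or b
NOT = lambda a: not a

# Any truth table can be built by composing them. For example, XOR:
XOR = lambda a, b: OR(AND(a, NOT(b)), AND(NOT(a), b))

for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "->", int(XOR(a, b)))
```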

Quantum computing involves a much richer set of basic gate operations, many more than three. As in classical computing, understanding the theoretical operation of quantum gates is a math exercise. However, unlike in classical computing, the practical operation of quantum gates is not as well understood, because the physical building blocks performing the gate operations are much more complex. Determining how to optimally combine quantum gate operations is an open area of research addressing questions such as:

  • Is it best to use the fastest gates to race against decoherence, or should the most stable gates be used to minimize errors?
  • Is the full set of quantum gate operations needed, or can a select few be used to replicate the others?
  • When using quantum gates to monitor errors from other gates, does the reduction in error from the monitoring outweigh the additional decoherence introduced by the added gates?
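
For the mathematically curious, here is a minimal Python sketch (using NumPy) of two well-known quantum gates, Hadamard and CNOT, written as the matrices the math exercise mentioned above works with. Applied in sequence to two qubits that both start at zero, they produce a “Bell state,” the simplest example of the entanglement described earlier.

```python
import numpy as np

# Quantum gates are unitary matrices acting on the amplitude vector.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)  # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],       # Controlled-NOT: flips the second
                 [0, 1, 0, 0],       # qubit when the first qubit is 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
I = np.eye(2)

# Start both qubits at 0, put the first in superposition, then entangle.
state = np.array([1.0, 0.0, 0.0, 0.0])
state = np.kron(H, I) @ state  # apply H to the first qubit only
state = CNOT @ state
print(state.round(3))  # ~[0.707, 0, 0, 0.707]: an entangled Bell state
```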

Potential Impacts of Quantum Computing

Data Encryption

Today’s encryption mechanisms, broadly speaking, work by exploiting the fact that it is easy to multiply two large prime numbers together, but it is difficult to decompose that product back into those two numbers. For example, it’s easy to calculate that 3,877 × 6,569 = 25,468,013, but starting with that eight-digit number and figuring out the two four-digit factors is much harder, unless you already know one of them; with that information, finding the second is a simple division. This mathematical asymmetry creates a way to allow designated users through while keeping adversaries out.

An adversary using a classical computer does not have much recourse other than trying to divide by every prime (2, 3, 5, 7, 11, 13 and so on) up to the square root of the target. Our eight-digit example would be easy to crack, but each additional digit roughly triples the work, so a 16-digit number would take about 10,000 times longer to crack than an eight-digit one, with similar scaling as the numbers continue to grow. The “64” in “64-bit encryption” refers to how many binary digits are in the big number.
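
A minimal Python sketch makes the asymmetry visible. The trial_division helper is our own naive implementation of the divide-by-everything attack described above, not a real cryptanalytic tool.

```python
import math
import time

def trial_division(n):
    """Factor n by checking every divisor up to sqrt(n)."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return n, 1  # n is prime

p, q = 3_877, 6_569
n = p * q  # the easy direction: instant

start = time.perf_counter()
print(trial_division(n))  # the hard direction: ~sqrt(n) steps
print(f"factored in {time.perf_counter() - start:.4f} seconds")
```

Even this eight-digit example requires thousands of division checks, and doubling the number of digits squares that count.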

However, quantum computing and its own scaling effects remove the ability to rely on big numbers to ensure security. Researchers in 2015¹ predicted that a quantum computer capable of breaking modern encryption would be available by 2026 with a 1 in 7 probability, and by 2031 with a 1 in 2 probability. More recent updates to this timeline² are slightly less urgent, but they spell a Y2K-level problem that is unlikely to be as fleeting as the original.

Drug Design

Computers increasingly are key components in the design of new pharmaceutical products. Biologic drugs, which are derived from living cells rather than chemically engineered like traditional drugs, have grown rapidly in use in recent years. In 2018, only four of the top 15 grossing pharmaceutical products were not biologic drugs. Because biologics come from living cells, the processes that create them are immensely more complex than those behind traditional chemically engineered drugs.

At a very high level, biologic drug development relies on a “natural selection” process: After finding a bacterium that creates a protein with a therapeutic use, researchers allow that bacterium to mutate, test the efficacy of the evolved group of bacteria, select the best candidates and repeat the process. Conventional computers can model some of the systems that create these proteins and take the guess-and-check aspect out of the process, but most such models are out of reach even for the most powerful systems available today. Modeling something as simple as a penicillin molecule would require an impossibly large classical computer. Quantum computing could make this type of modeling practical and lead to faster and more innovative drug development.
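
As a loose illustration only, here is a toy Python sketch of that mutate-test-select loop. The bit-string “protein,” the fitness function and every parameter are invented for demonstration and bear no resemblance to real protein chemistry.

```python
import random

random.seed(42)
TARGET = [1] * 20  # a stand-in for the "ideal" protein

def fitness(protein):
    # In the lab, this is the slow, expensive efficacy test.
    return sum(a == b for a, b in zip(protein, TARGET))

def mutate(protein, rate=0.05):
    # Randomly flip bits, mimicking mutation between generations.
    return [1 - bit if random.random() < rate else bit for bit in protein]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for generation in range(25):
    offspring = [mutate(p) for p in population for _ in range(3)]
    population = sorted(offspring, key=fitness, reverse=True)[:30]

print("best fitness after selection:", fitness(population[0]), "/ 20")
```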

Even the development of more traditional pharmaceutical products is increasingly leveraging machine learning and would be impacted by the immediate expansion of computing power a quantum breakthrough would offer. In February 2020, MIT researchers announced they had discovered a new antibiotic compound that killed strains of bacteria that are resistant to all known antibiotics. The drug was discovered using a machine learning model that scanned hundreds of millions of chemical compounds looking for those with chemical structures that enable a drug to kill bacteria. This was an especially important breakthrough, as antibiotic-resistant bacteria pose a growing threat to humanity, yet research into new antibiotics largely has stalled due to limited market opportunity and the difficulty of finding new drugs. Efficient quantum computing could represent an opportunity to significantly reduce drug development costs while simultaneously discovering new and innovative uses for existing chemical compounds.

Climate Change

Quantum computing promises to aid a future increasingly impacted by climate change. Its ability to handle multiple variables simultaneously promises better weather predictions, a modeling problem popularly characterized by “the butterfly effect” (the idea that a butterfly flapping its wings in South America can cause a hurricane in North America; it’s not quite true, but it illustrates the compounding and complex interactions of weather events).
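
To see why tiny differences defeat long-range forecasts, here is a small Python demonstration using the Lorenz equations, a standard toy model of atmospheric convection. The step size and starting points are arbitrary choices for illustration: two simulations that begin one part in a million apart end up on completely different trajectories.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One Euler step of the Lorenz convection equations.
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

a = np.array([1.0, 1.0, 20.0])
b = a + np.array([1e-6, 0.0, 0.0])  # a "butterfly flap" of difference

for step in range(1, 3001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        print(f"step {step}: forecasts differ by {np.linalg.norm(a - b):.3f}")
```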

A number of car manufacturers have begun investigating uses for quantum computing that would change the way we drive. Daimler AG (the parent company of Mercedes-Benz) announced in 2018 that it had partnered with Google and IBM to look at uses of quantum computing to improve the batteries of electric cars. The company also is looking at the ways AI powered by quantum computing could handle the logistics of managing traffic on roads filled with autonomous vehicles. Volkswagen also has been investigating traffic-related optimization in cities. This type of modeling could result in more efficient public transportation and reduce travel times in densely populated places.

Illustrating the varied uses of quantum computing, Microsoft researchers have been looking at ways to more efficiently produce fertilizer to slow climate change. Nearly 3 percent of annual global energy consumption goes to the Haber process, a method in which atmospheric nitrogen is heated and pressurized into ammonia, and the process accounts for about 1 percent of greenhouse gas emissions. However, some bacteria are able to perform the conversion naturally; we just can’t figure out how they do it. Microsoft was able to isolate the molecule in those bacteria that a quantum computer would need to simulate for us to understand that process. Now, we just need to wait for a quantum computer that can do the work.

Conclusion

Charles Holland Duell, commissioner of the U.S. Patent Office in 1899, is famously, though likely apocryphally, credited with saying, “Everything that can be invented has been invented.” It’s easy to pillory the statement: The 1901 patent on the assembly line, the 1928 discovery of antibiotics, the 1938 invention of xerography (a precursor to the modern photocopier), the 1942 initiation of a self-sustaining nuclear chain reaction, the 1969 workable prototype of the internet and the 1973 marketing of the personal computer only begin to form a rebuttal.

But it’s important to note that each of those technologies was critical not just in and of itself, but for the way it interacted with the others and the additional technologies it enabled. At least four of the technologies mentioned in this article have enabled this article to reach a broad audience. Duell was shortsighted to think the Patent Office would be a dull place in the 20th century, but perhaps even more significantly, his bleak view of the future missed the ways in which technological progress moves much more dynamically than the sum of recent inventions alone would indicate.

In a similar way, quantum computing represents an important advancement in computing technology—we’ll be able to do the things we do today faster, better and more robustly. But even more exciting are the currently unimaginable technologies quantum computing could enable. Quantum technology could usher in a new era of engineering, medical and social change that far outweighs the impact of really fast computers.

Just as the job of actuaries has changed dramatically over the last 50 years because of key technological shifts, a quantum computing breakthrough will produce similarly dramatic changes. Yet the skills actuaries brought to the table in 1970 and in 2020 are not all that different. And we predict that in 2070, there will never have been a more exciting time to be an actuary.

Tony Pistilli, FSA, CERA, MAAA, is director of Actuarial Services and Analytics at Optum, leading data-driven concept research and development for Optum’s Payment Integrity products. He is also a contributing editor for The Actuary.
Chris Rohde, ASA, MAAA, is an actuarial consultant at UnitedHealthcare, where his work focuses on Medicare Advantage bid pricing and model development.

Copyright © 2021 by the Society of Actuaries, Schaumburg, Illinois.