UII UPDATE 352 | APRIL 2024

Intelligence Update

Quantum’s quandary: racing toward reality or stuck in hyperbole?

Press announcements from quantum computing vendors often imply, with great fanfare, that their latest developments mark a turning point in research. This tends to spark a flurry of responses from competing companies, with each trying to show the market that they are making the fastest progress.

However, the fact remains: there are no guarantees that a practical quantum computer is even possible. While researchers remain hopeful, every increase in scale and each new technology introduces novel challenges that need solving.

If there is no real product in sight, why announce anything at all?

One important reason is for researchers to demonstrate their advances and show that they are adding value. Quantum computing research requires significant expenditure, and the return on investment will be substantial if a quantum computer can solve problems previously deemed unsolvable. However, this return is not assured, nor is the timeframe for when a useful quantum computer might be achievable.

To continue to receive funding and backing for what ultimately is a gamble, researchers need to show progress — to their bosses, investors and stakeholders. When one company announces an advancement, other companies feel compelled to follow suit or risk being perceived as falling behind.

Quantum computing announcements also have a broader remit — a halo effect that reinforces a company’s image as innovative and forward-looking. Some of the companies claiming they lead the way in quantum computing research, such as Google, Amazon and Microsoft, want to be seen as technology pioneers — being revolutionary is a key part of their brand identity. If and when quantum computing becomes practical, an established reputation might aid commercial success.

Research continues

Quantum bits (qubits) are the basic units in which a quantum computer stores and manipulates information: the more qubits, the greater the number of values the computer can draw on for calculations. Solving real-world computational problems with quantum algorithms requires many useful qubits (those holding the correct quantum information) — see New quantum cloud region signals increased commercial focus for a more in-depth explainer.
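The scaling behind "more qubits, more values" can be made concrete with a few lines of arithmetic. The sketch below is illustrative and not from the source; it simply computes how the number of basis states an n-qubit register can represent doubles with every added qubit:

```python
def state_space_size(num_qubits: int) -> int:
    """Number of basis states a register of qubits can represent.

    Each added qubit doubles the state space, which is why even
    modest qubit counts describe enormous computational spaces.
    """
    return 2 ** num_qubits

# A classical 64-bit register addresses 2**64 values one at a time;
# a 105-qubit chip works with a superposition over 2**105 basis states.
for n in (10, 64, 105):
    print(f"{n} qubits -> {state_space_size(n):.3e} basis states")
```

The catch, as the next paragraph notes, is that each of those qubits must hold the correct quantum information for the state space to be usable.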

Unfortunately, as the number of qubits increases, so does the complexity of errors caused by quantum noise and circuit control issues. The primary objective of quantum computing research is to develop fault-tolerant (i.e., reliable, in classical computing parlance) quantum computers with thousands of qubits that are free from computational errors.

Despite these fundamental difficulties, the nascent quantum computing industry appears more buoyant than ever. Beginning in December 2024, several major players announced research breakthroughs in quick succession, specifically regarding a reduction in errors while dealing with more qubits and greater computing power.

Google introduced the Willow processor, a 105-qubit superconducting quantum computing chip that reduces errors exponentially as the number of qubits scales. Willow completed a benchmark computation (the emulation of quantum circuits) in under five minutes. According to Google, this task would take current supercomputers 10 septillion years, a figure that exceeds the age of the universe by many orders of magnitude.
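Google's figure can be sanity-checked with back-of-the-envelope arithmetic. In the sketch below, the 10 septillion (10^25) year runtime comes from the announcement; the roughly 13.8-billion-year age of the universe is the commonly cited cosmological estimate and is an assumption here:

```python
claimed_runtime_years = 1e25       # 10 septillion years, per Google's claim
age_of_universe_years = 1.38e10    # ~13.8 billion years (assumed estimate)

# Ratio of the claimed classical runtime to the age of the universe
ratio = claimed_runtime_years / age_of_universe_years
print(f"Claimed runtime is ~{ratio:.1e} times the age of the universe")
```

The ratio comes out on the order of hundreds of trillions, underscoring how far the claimed runtime outstrips any physical timescale.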

Then Microsoft announced Majorana 1, the world's first quantum chip powered by a topological core architecture. This development leverages Majorana quasiparticles — which behave like their own antiparticles — to create qubits that are resistant to errors.

Amazon followed with the unveiling of its first quantum computing chip, Ocelot. The chip leverages bosonic quantum error correction using "cat qubits," a novel approach designed to significantly reduce the resources needed for error correction — potentially cutting them by up to 90% compared with traditional methods.

Next, D-Wave Quantum announced that its quantum computer had outperformed one of the world's most powerful classical supercomputers in solving complex simulations. The company claims its annealing quantum computer performed magnetic materials simulations in minutes — tasks that would take nearly one million years and require more energy than the world’s annual electricity consumption if solved using a classical supercomputer built with GPU clusters.

Most recently, Q-CTRL, working with Nvidia and Oxford Quantum Circuits, claimed a significant reduction in classical compute costs by using accelerated GPUs for quantum error suppression.

Beyond technical innovations, there are signs of increasing government, academic and corporate interest. IBM and the Basque government announced plans to install Europe's first IBM Quantum System Two at the IBM-Euskadi Quantum Computational Center in Spain. In addition, IBM recently launched a new quantum data center and cloud region in Germany.

The value of announcements

As soon as such announcements are made, scientists and researchers scrutinize them for weaknesses and hyperbole. The benchmarks used for these tests are subject to immense debate, with many critics arguing that the computations are not practical problems or that success in one problem does not imply broader applicability. In Microsoft’s case, a lack of peer-reviewed data means there is uncertainty about whether the Majorana quasiparticle even exists beyond theory.

The scientific method encourages debate and repetition, with the aim of reaching a consensus on what is true. In quantum computing, however, marketing hype and the need to demonstrate advancement often take priority over the verification of claims, making it difficult to judge what these announcements mean for the bigger picture.

Should the data center industry prepare for quantum computing? It is likely too early to consider quantum computing for commercial applications. Progress is being made, albeit slowly. Even if a practical quantum computer is created, it will take years for such machines to become mainstream. Claims will need to be verified, hardware will need to be manufactured and supply chains will need to be developed. Furthermore, governments will likely intervene due to the potential national security implications.

According to the experts Uptime Intelligence consults, a useful quantum computer is still at least a decade away. When this analyst first covered quantum computing in 2017, experts were saying the exact same thing. The fact is, no one knows for sure.

About the Author

Owen Rogers

Dr. Owen Rogers is Uptime Institute’s Sr. Research Director of Cloud Computing. Dr. Rogers has been analyzing the economics of cloud for over a decade as a product manager, a PhD candidate and an industry analyst. Rogers covers all areas of cloud, including economics, sustainability, hybrid infrastructure, quantum computing and edge.