Quantum computing promises revolutionary breakthroughs, but exaggerated claims and premature hype risk eroding public trust before the technology becomes truly useful
Despite decades of research and investment, progress in quantum computing remains hindered by its Achilles' heel: the lack of practical applicability. Even by the most optimistic estimates, commercially relevant quantum computers are “at least a decade away,” a statement that has been reiterated time and again. Although recent developments in quantum error correction and hardware are certainly encouraging, practical large-scale quantum computers are still likely a long way off. Nevertheless, numerous false and superficial assertions have been made about the technology. Besides being disingenuous in many cases, such claims severely undermine public confidence in an otherwise blossoming scientific endeavour. The field urgently needs a reality check: the rumours and false claims that plague it must be dispelled, and the major obstacles it faces clearly recognised and addressed.
While classical computing relies on bits, which can be either a 0 or 1, quantum computers employ qubits, which can assume a superposition of values between 0 and 1. Similar to digital circuits in classical computers, quantum circuits using quantum logic gates are employed to build quantum computers. The chief impediment to quantum computing comes from quantum decoherence or “noise,” due to the interaction of qubits with the external environment, which causes the information carried by them to decay over time. This is addressed by using Quantum Error Correction (QEC), which operates by combining multiple physical qubits into error-corrected “logical” qubits.
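The contrast between bits and qubits can be made concrete with a minimal sketch. The snippet below (an illustrative simulation, not how a physical quantum device is programmed) represents a qubit as a pair of complex amplitudes and applies a Hadamard gate, the standard gate for creating an equal superposition of 0 and 1:

```python
import math

# A qubit is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# |0> = (1, 0), |1> = (0, 1); measuring yields 0 with probability |a|^2.
ket0 = (1 + 0j, 0 + 0j)

def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard(ket0)                     # equal superposition of |0> and |1>
probs = [abs(amp) ** 2 for amp in plus]
print(probs)                              # ~[0.5, 0.5]
```

A classical bit would be either `(1, 0)` or `(0, 1)`; the superposition state holds both amplitudes at once, which is what quantum circuits exploit.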
Figure 1: Understanding Qubits (Source: Scientific American)
Quantum computing has entered the era of Noisy Intermediate-Scale Quantum (NISQ) computers, which, for the first time, enable the execution of tasks that are too complex for even the most powerful classical supercomputers. However, these tasks currently focus mainly on benchmarking and have limited practical utility. As the technology continues to develop, NISQ computers will eventually give way to larger Fault-Tolerant Application-Scale Quantum (FASQ) computers, making practical applications of quantum computing feasible across a wide range of fields.
However, several hardware and algorithmic constraints must be overcome before FASQs can be built, with QEC chief among them. Broadly, QEC involves running quantum computations repeatedly while extracting error syndromes. The syndrome history is then fed to a classical computer, which suggests a recovery step to remove most of these errors, a process known as syndrome decoding. The basic difficulty with QEC stems from the fact that the very process of extracting error syndromes from quantum circuits propagates errors within them. This is further compounded by the complexity of syndrome-decoding algorithms, which must be fast enough to keep pace with the quantum computations they run alongside.
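The logic of syndrome decoding can be illustrated with its simplest classical analogue: a three-bit repetition code. Real QEC measures stabilisers on quantum states without collapsing them, which this sketch does not capture, but the pipeline is the same: parity checks reveal where an error sits without reading the encoded value, and a decoder maps the syndrome to a recovery step.

```python
def encode(bit):
    # Repetition code: one logical bit -> three physical bits.
    return [bit, bit, bit]

def syndrome(q):
    # Parity checks (the classical analogue of stabiliser measurements):
    # they locate an error without revealing the encoded value itself.
    return (q[0] ^ q[1], q[1] ^ q[2])

def decode(q):
    # Syndrome decoding: map each syndrome to the most likely single-bit
    # error and apply the corresponding recovery flip.
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(q))
    if flip is not None:
        q[flip] ^= 1
    return q

codeword = encode(1)
codeword[2] ^= 1          # a single "noise" flip on physical bit 2
print(decode(codeword))   # recovered: [1, 1, 1]
```

Any single flip is corrected; two simultaneous flips defeat the code, which is why larger codes with more redundancy are needed as error rates and circuit sizes grow.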
One of the most promising avenues for QEC is the surface code, owing to its ability to tolerate noise and its relatively simple syndrome extraction. However, even surface codes require roughly 1,000 physical qubits for every logical qubit, which is beyond the reach of current devices. Nonetheless, recent advancements such as Google’s Willow processor have shown promising results, thereby spurring optimism for the future of surface codes. Other emerging avenues of QEC, such as quantum low-density parity-check (qLDPC) codes, fluxonium qubits, and cat qubits, also appear promising, though they come with their own set of caveats.
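The "roughly 1,000 physical qubits per logical qubit" figure can be sanity-checked with back-of-the-envelope arithmetic. A common layout (the rotated surface code) uses about 2d² − 1 physical qubits for a patch of code distance d; the sketch below, an illustrative estimate rather than an engineering spec, shows that the ~1,000 figure corresponds to a distance in the low twenties, a level often cited as necessary under realistic hardware error rates:

```python
def physical_per_logical(d):
    # A distance-d rotated surface code patch uses d*d data qubits plus
    # d*d - 1 measurement qubits, i.e. roughly 2*d^2 per logical qubit.
    return 2 * d * d - 1

for d in (11, 17, 23):
    print(f"distance {d}: {physical_per_logical(d)} physical qubits")
# distance 23 gives 1,057 physical qubits, matching the ~1,000 estimate
```

The quadratic growth in d is the crux: each increment in error-suppression capability multiplies the hardware bill.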
Despite these advancements in QEC, the transition towards FASQs will likely take a long time to fructify. In the meantime, Quantum Error Mitigation (QEM) is much more practically feasible for NISQs. QEM algorithms such as Zero-Noise Extrapolation (ZNE) and Probabilistic Error Cancellation (PEC) involve sampling quantum circuits and subsequently applying classical post-processing to the outcomes. The basic problem with QEM is that it scales unfavourably with circuit size as compared with QEC, making it less relevant for FASQs. However, it can be extremely effective from the perspective of NISQs and may eventually lead to useful practical applications.
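ZNE, the simplest of these QEM techniques, can be sketched in a few lines. The idea: deliberately amplify the noise in a circuit (for example, by gate folding), measure the expectation value at several noise levels, fit a curve, and extrapolate back to the zero-noise limit. The numbers below are hypothetical measurements for illustration:

```python
def zero_noise_extrapolate(scales, values):
    # Fit a straight line value ~ a + b*scale by least squares and
    # return its intercept: the estimated value at zero noise (scale = 0).
    n = len(scales)
    mx = sum(scales) / n
    my = sum(values) / n
    b = sum((x - mx) * (y - my) for x, y in zip(scales, values)) / \
        sum((x - mx) ** 2 for x in scales)
    return my - b * mx

# Hypothetical expectation values measured at artificially amplified
# noise levels 1x, 2x, 3x.
scales = [1.0, 2.0, 3.0]
values = [0.80, 0.62, 0.44]
print(zero_noise_extrapolate(scales, values))  # ~0.98
```

The classical post-processing is cheap, but the number of circuit samples needed for a reliable fit grows quickly with circuit size, which is precisely the unfavourable scaling noted above.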
Although current NISQs have achieved so-called “quantum advantage”[i] for certain tasks such as random circuit sampling, they are of limited practical utility. Most ongoing efforts in developing quantum algorithms are focused on Variational Quantum Algorithms (VQAs), which utilise a hybrid approach combining quantum and classical computers and have potential applications in optimisation and machine learning. However, these algorithms are plagued by multiple issues, such as the barren plateau phenomenon. This has led to a sharper focus on executing subtasks within VQAs through techniques such as the Quantum Approximate Optimisation Algorithm (QAOA), and on novel paradigms such as Decoded Quantum Interferometry (DQI), which seem promising. The algorithms with proven quantum advantage, such as Grover’s and Shor’s, on the other hand, will require FASQs.
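The hybrid quantum-classical loop at the heart of a VQA can be sketched with a toy one-parameter example. Here the "quantum" expectation value is simulated classically (for a single qubit rotated by Ry(θ) and measured in Z, it is exactly cos θ), and a classical optimiser updates the parameter using the parameter-shift rule; on real hardware, each call to `expectation` would be a batch of circuit executions:

```python
import math

def expectation(theta):
    # Stand-in for the quantum processor: <Z> for the state Ry(theta)|0>
    # is exactly cos(theta), so this toy case is classically simulable.
    return math.cos(theta)

def gradient(theta):
    # Parameter-shift rule: an exact gradient from two extra circuit runs.
    return (expectation(theta + math.pi / 2) -
            expectation(theta - math.pi / 2)) / 2

# Hybrid loop: the classical optimiser proposes parameters, the
# (simulated) quantum device returns expectation values.
theta = 0.1
for _ in range(200):
    theta -= 0.3 * gradient(theta)

print(round(expectation(theta), 3))  # converges to the minimum, -1.0
```

In this one-parameter toy the landscape is benign; the barren plateau problem arises when many-qubit circuits make such gradients vanish exponentially, starving the classical optimiser of signal.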
When it comes to quantum simulations, their main utility lies in the fact that they can, in principle, be used to simulate many-particle, strongly correlated systems[ii] which classical computers tend to struggle with, particularly as the systems get larger. Despite steady progress in this domain, the main challenge lies in demonstrating quantum advantage, a recurring issue. While it is almost certain that quantum simulations will provide new scientific insights even in the NISQ era, they will find broader commercial relevance only once FASQs become a reality.
The field of quantum computing has been tarnished by a plethora of questionable and, in some cases, outright false claims. These range from “achieving quantum advantage” to “quantum AI” and a host of other misleading and manipulative announcements. While many of these claims originate from genuine discoveries and advancements, a significant portion is designed to create artificial hype. This tactic can spur investments and inflate company stock prices, reflecting a lack of honesty in academic practices. As a result, it has become increasingly difficult to distinguish between facts and fiction.
Several prominent quantum computing companies, such as IonQ, Rigetti, and D-Wave, have profited enormously over the past few years while utilising questionable financial tactics such as Special Purpose Acquisition Companies (SPACs). Given the aforementioned reasons, commercially useful applications of quantum computers are virtually non-existent at the moment. Consequently, a large portion of these companies’ profitability relies on consulting or “teaching” other businesses (such as through proof-of-concept demonstrations) how quantum computers will aid them in the future, fuelling speculation. For instance, in March 2025, the D-Wave research team made a substantial claim, stating it had achieved quantum advantage using its Advantage2 processor by performing a magnetic materials simulation that would take a classical supercomputer “one million years” to solve. This led to an immediate 10 percent stock price increase for D-Wave, alongside similar trends for other companies, including IonQ, Rigetti, Quantum Computing, and Arqit Quantum. However, the claim was challenged almost immediately by multiple research teams, who showed that the same result could be achieved easily using classical computers.
Even big tech corporations are guilty of eagerly reporting misleading results without conducting the necessary benchmarking and verification. Google claimed quantum advantage as far back as 2019, while IBM did the same in 2023. Microsoft's announcement of its Majorana 1 chip in February 2025 faced significant criticism for making exaggerated claims. The company had already been criticised for making similar claims in a 2018 paper, which it later retracted.
Another prominent avenue for creating hype and speculation comes from repeated claims of encryption-breaking quantum computers. Most current encryption protocols, such as Rivest-Shamir-Adleman (RSA), rest on the problem of factorising large numbers into their prime components, which is particularly difficult for classical computers. Quantum computers, on the other hand, could perform the task very quickly using Shor’s algorithm. However, as mentioned earlier, implementing Shor’s algorithm requires FASQs, which are nowhere near fruition. Nevertheless, there have been multiple claims of successful quantum factorisation. For instance, in 2024, a group of Chinese researchers claimed to have broken military-grade RSA-2048 encryption using a D-Wave quantum computer, a claim that was subsequently widely refuted. More pointedly, in a recent satirical paper titled “Replication of Quantum Factorisation Records with an 8-bit Home Computer, an Abacus, and a Dog,” computer scientist Peter Gutmann and his co-author showed how to replicate the claimed computations with a VIC-20 8-bit computer from 1981. The authors also note that quantum factorisation has not been successfully achieved for any number beyond 21.
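The gap between these claims and reality is easy to appreciate once the structure of Shor’s algorithm is spelled out. The algorithm does not attack RSA directly; it reduces factoring N to finding the order r of a number a modulo N, and only that order-finding step runs on the quantum computer. The sketch below performs the same reduction entirely classically, finding the order by brute force, which is feasible for N = 21 (the current record mentioned above) and hopeless for a 2,048-bit modulus:

```python
from math import gcd

def factor_via_order(N, a):
    # The reduction underlying Shor's algorithm: factoring N reduces to
    # finding the order r of a modulo N. A quantum computer finds r
    # efficiently; here we find it by brute force, the step that is
    # exponentially hard classically.
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)   # lucky: a shares a factor with N
    r, x = 1, a % N
    while x != 1:
        x = x * a % N
        r += 1
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                        # unlucky choice of a; try another
    p = gcd(pow(a, r // 2, N) - 1, N)
    return p, N // p

print(factor_via_order(21, 2))  # (7, 3)
```

For N = 21 the brute-force loop terminates after six multiplications; for RSA-2048 the order is astronomically large, which is exactly why breaking it awaits fault-tolerant machines rather than today’s devices.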
From being conceived as a mere thought experiment in the early 1980s, quantum computing has come a long way in terms of both hardware and algorithms. However, despite the exciting scientific possibilities it presents, commercial applicability remains an uphill battle. Though this is the case with most novel technologies, the artificial hype and misleading claims surrounding quantum computing have the potential to diminish public confidence in an otherwise promising scientific endeavour. Perhaps the biggest concern is the potential creation of a quantum computing financial bubble, which, if current attitudes persist, will eventually pop, severely harming the future investment that is critical for the emerging technology to evolve.
Addressing the issue will require a collective effort from academia, the private sector, and media outlets. Academia needs to be more careful in publishing quantum computing literature, which, in turn, requires better vetting and peer-review processes, in addition to ensuring that corporate-backed research groups do not enjoy an unfair advantage. The private sector needs to curb its habit of prematurely professing quantum advantage without the necessary benchmarking and verification. While this may seem an effective way to attract short-term investment, it can have crippling consequences, threatening to derail those very investments in the future. Though proving quantum advantage is itself a complicated task and occasional errors are understandable, the sheer volume of such claims is unwarranted. Recent initiatives such as the Quantum Advantage Tracker and the United Kingdom’s QCMet Project are encouraging steps in this regard. Finally, responsible reporting by journalists and media outlets is also critical for ensuring the technology’s prosperous future.
Prateek Tripathi is an Associate Fellow at the Centre for Security, Strategy and Technology (CSST), Observer Research Foundation.
[i] Quantum advantage refers to the ability of a quantum computer to perform a certain task more efficiently than is possible with a classical computer.
[ii] Strongly Correlated Systems refers to materials in which electron interactions are strong enough to affect their collective behaviour.
The views expressed above belong to the author(s).