Are Quantum Qubits Overrated? The Rational Physics Debate

Quantum computers are both the most promising and the most confusing area of innovation in computing. On one hand, they promise to perform calculations that would otherwise be utterly impossible, and seem at times to break every rule and limitation of conventional computers.
On the other hand, they are extremely difficult to build, and scaling their computing power to useful levels is harder still. There is also much we do not understand about quantum physics, leaving the concept of quantum computing vulnerable to unexpected surprises. For example, a proper theory of quantum gravity has remained elusive for decades, potentially pointing to a deep flaw in our understanding of quantum mechanics.
This last idea, that quantum physics itself may impose fundamental limits, has recently been developed further by Tim Palmer, a researcher at the University of Oxford best known for his work on chaos theory and climate.
He argues that fundamental mathematical properties of the quantum state space might inherently limit the actual capabilities of quantum computers, far more than previously thought.
He published his study in the prestigious scientific journal PNAS1, under the title “Rational quantum mechanics: Testing quantum theory with quantum computers”.
Understanding the Hype: How Do Quantum Computers Work?
Before discussing Professor Palmer’s idea, it can be useful to understand what makes quantum computers special.
The key difference is that instead of discrete bits with values of 0 and 1 like a normal computer, a quantum computer uses qubits, which exhibit quantum superposition and entanglement.
In simplified terms, this means each qubit can inherently store more complex information at once, making calculations involving large mathematical matrices easier.
So for a complex data set with many possible values for each data point, such as the spin values of electrons or atoms in a chip or a battery electrode, quantum computers can handle the mounting complexity, with each added qubit exponentially increasing capacity.
In contrast, a normal computer’s capacity grows only linearly, one bit at a time, so a calculation that becomes exponentially more complex with each added data point quickly becomes unmanageable, its multiplying complexity overwhelming even the best classical supercomputer.
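The scaling difference can be made concrete with a minimal Python sketch. This is an illustration of the standard counting argument, not code from Palmer’s paper: describing an n-qubit state takes 2^n complex amplitudes, while describing n classical bits takes only n bits.

```python
# Illustration (standard counting, not from the paper): state-space
# growth for qubits vs. classical bits.

def classical_description_size(n_bits: int) -> int:
    """Bits needed to describe the state of n classical bits: grows linearly."""
    return n_bits

def quantum_state_dimension(n_qubits: int) -> int:
    """Number of complex amplitudes in an n-qubit state vector:
    doubles with every added qubit."""
    return 2 ** n_qubits

# A 50-qubit state already needs ~10^15 amplitudes to describe classically,
# while 50 classical bits need exactly 50 bits.
for n in (1, 10, 50, 300):
    print(f"n = {n}: classical description = {classical_description_size(n)} bits, "
          f"quantum state dimension = {quantum_state_dimension(n)}")
```

This doubling of the state-vector dimension with each added qubit is the “exponentially increasing capacity” referred to above.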
At least, that is the theory, as supported by mainstream quantum mechanics. But Prof. Palmer argues that this is not the case.
Quantum Mechanics vs. Rational Quantum Mechanics (RaQM)
What is Hilbert Space? The Framework of Quantum Power
The “mainstream” concepts of quantum physics are generally grouped under the term “quantum mechanics” (QM) and describe the complex, often counterintuitive phenomena occurring at the quantum scale.
A key element relevant to quantum computers is the idea of Hilbert space. This concept expands the familiar 2D or 3D space to any number of dimensions and creates the mathematical framework on which most quantum physics is built.
“Hilbert space is a mathematical concept in linear geometry that defines an infinite-dimensional space. In other words, it takes geometric concepts that are limited to dealing with two- and three-dimensional spaces and expands them so that they can be used with an infinite number of dimensions.”
Because it is such a fundamental tool of quantum physics, it is rarely questioned. And it is certainly a “true” idea in general, as it made possible most of the predictions of quantum physics that have been confirmed experimentally.
“Hilbert spaces are crucial in fields such as quantum mechanics, where they provide the mathematical framework for understanding the behavior of particles at microscopic scales. This includes applications in solving complex equations like Schrödinger’s equation, which describes how quantum systems evolve over time. ”
In the standard interpretation, the number of dimensions in the Hilbert space describing a quantum computer grows exponentially with the number of qubits: n qubits span a space of 2^n dimensions. This interpretation depends entirely on the continuum nature of Hilbert space, which is the idea Prof. Palmer is challenging.
Rational Quantum Mechanics: Challenging the Continuum
The theory published by the Oxford physicist disputes that Hilbert space really behaves in this manner, and points to the elusiveness of quantum gravity as an indication that it might not. He calls his theory “rational quantum mechanics” (RaQM).
“We introduce a theory of quantum physics based on the notion that the continuum nature of quantum mechanics’ state space approximates something inherently discrete, and argue that the reason for such discreteness is gravity.”
The idea is that Hilbert space is indeed granular, but with extremely fine granularity, as gravity is so weak compared with the other fundamental forces. He developed these ideas further in a companion scientific paper2 titled “Solving the Mysteries of Quantum Mechanics: Why Nature Abhors a Continuum”.
Without going into the mathematical details, the theory holds that the quantum state is only defined with respect to certain “rational” observables. This leads to a slightly different treatment of complex numbers, such as the imaginary unit √(-1), and of the so-called quaternions, which allows a realistic interpretation of the quantum state in RaQM, in contrast with QM.
Or as Prof. Palmer puts it, his theory removes some of quantum physics’ famous paradoxes, like Schrödinger’s cat.
“In RaQM, cats are no longer simultaneously alive and dead.”
The 1,000-Qubit Ceiling: Practical Implications for the Future
An essential part of the premise of ultra-powerful quantum computers is that adding more qubits adds more “dimensions” to work on a mathematical problem. This assumption rests on the idea that Hilbert space supplies an unlimited number of new dimensions, in effect new data storage, as more qubits are added to the system.
Prof. Palmer’s idea would therefore have serious implications for quantum computers.
If his theory is correct, the information content of the quantum state grows linearly with the number of qubits, not exponentially as previously thought, essentially breaking the central premise of quantum computing.
“Above a critical number of entangled qubits, there simply isn’t enough information in the quantum state to allocate even one bit of information to each dimension of Hilbert space. When this happens, quantum algorithms that utilize all of Hilbert space will stop having a quantum advantage over classical algorithms.”
The paper estimates that this threshold could be reached once quantum computers exceed somewhere between a few hundred and roughly 1,000 error-corrected qubits.
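The core of the counting argument can be illustrated with a toy calculation. This is purely illustrative: Palmer’s actual threshold rests on gravitational arguments, and the constant k below is an arbitrary assumption. The point is that an exponentially growing dimension count (2^n) overtakes any linearly growing information budget (k·n bits) after a modest number of qubits, no matter how generous k is.

```python
# Toy illustration only: NOT the paper's derivation. It shows how quickly
# 2**n (Hilbert-space dimensions) outpaces k * n (a linear information
# budget), for any assumed constant k.

def crossover_qubits(k: float) -> int:
    """Smallest n with 2**n > k * n: past this point, a budget of k*n bits
    cannot assign even one bit of information per Hilbert-space dimension."""
    n = 1
    while 2 ** n <= k * n:
        n += 1
    return n

# Even absurdly large constants only push the crossover to a few hundred qubits.
for k in (1e6, 1e30, 1e90):
    print(f"k = {k:.0e} -> crossover at n = {crossover_qubits(k)} qubits")
```

This is why a ceiling in the range of hundreds of qubits is at least arithmetically plausible once one assumes linear, rather than exponential, information growth.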
It should be noted that this is well below the threshold expected to be required for breaking important levels of encryption: for example, an estimated 4,099 logical qubits are needed to break a 2048-bit RSA key using Shor’s algorithm, the quantum algorithm most likely to be useful for practical purposes.
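As a sanity check, the 4,099 figure matches the widely cited 2n + 3 logical-qubit circuit for Shor’s algorithm (Beauregard, 2003) applied to a 2048-bit RSA modulus; the article does not say which construction it uses, so this attribution is an assumption. The sketch below simply evaluates that formula against the ~1,000-qubit ceiling suggested by the paper.

```python
# Assumption: the article's 4,099-qubit figure corresponds to the 2n + 3
# logical-qubit construction for Shor's algorithm (Beauregard, 2003).

def shor_logical_qubits(rsa_bits: int) -> int:
    """Logical qubits for Shor's algorithm in the 2n + 3 construction."""
    return 2 * rsa_bits + 3

needed = shor_logical_qubits(2048)   # qubits needed to break RSA-2048
ceiling = 1000                       # upper end of the RaQM threshold estimate
print(f"Needed: {needed}, RaQM ceiling: ~{ceiling}, "
      f"out of reach under RaQM: {needed > ceiling}")
```

Note that these are error-corrected logical qubits; the physical-qubit counts of today’s machines are not directly comparable.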
If Prof. Palmer is right, this could mean that encryption will remain permanently safe from quantum computers as we understand them today.
As many quantum computer prototypes, alone or through networking, are approaching this limit, we should know fairly soon whether the idea holds.
“‘QM has met all the experimental challenges thrown at it and so, in the paper, I propose an experiment that could be performed in a few years – if one is to believe the quantum tech roadmaps – for testing RaQM against QM.’”
If proven true, the concept could also have major ramifications for quantum physics itself, well beyond limiting quantum computers’ potential. That in itself could make quantum computers very important, even if their practical applications turn out to be more limited than previously hoped.
“If quantum computers provide the experiments not only to find a successor theory to quantum mechanics, but more importantly to find the theory which synthesises quantum and gravitational physics, that would surely be an extraordinarily good outcome for all the work that has been put into quantum computing over the years.”
Strategic Investment Takeaways: Managing Quantum Risk
This new concept is far from proven, and is in fact a radical departure from the physics consensus on quantum mechanics. For now, it remains a very interesting but unproven theory that exists only as theoretical mathematics.
Investors in quantum computing stocks should nonetheless pay attention to it, as it is a reminder that quantum physics is still far from fully understood, and holds potential for both surprising new possibilities and unexpected limits on its practical applications.
Another consideration is that if encryption is permanently safe from quantum computers, so is Bitcoin, which has recently suffered from the narrative that it will soon be “broken” by progress in quantum computing, a topic we also covered in “The Post-Quantum Investment Audit: Top 10 Stocks for 2026”.
So it could make sense to balance both risks against each other:
- If quantum computers hit a hard ceiling at around 1,000 qubits, Bitcoin is safe, and the narrative that pushed the Bitcoin price downward goes away.
- If Prof. Palmer is wrong, quantum computers might indeed threaten the Bitcoin part of a portfolio, but they will also be capable of hard-to-imagine feats of calculation, both in breaking encryption and in deepening our understanding of the material world.
So a portfolio mixing quantum computing stocks and cryptocurrencies will probably best mitigate both eventualities.
For quantum computing investment, you can consult our investment report on Honeywell and its quantum computing subsidiary, Quantinuum, or our article “5 Best Quantum Computing Companies of 2025”.
References:
1. Tim Palmer. Rational quantum mechanics: Testing quantum theory with quantum computers. PNAS. 123 (12) e2523350123. March 16, 2026. https://doi.org/10.1073/pnas.2523350123
2. Tim Palmer. Solving the Mysteries of Quantum Mechanics: Why Nature Abhors a Continuum. Proceedings of the Royal Society. February 18, 2026. https://arxiv.org/abs/2602.16382