NextFin News - A fundamental barrier has emerged in the race to build powerful quantum computers, as new research reveals that these systems effectively "forget" their earlier operations as they attempt to perform more complex tasks. The study, published in Nature Physics on April 6, 2026, demonstrates that quantum noise—the environmental interference that plagues computation at the subatomic scale—does not just create errors; it actively erases the influence of earlier steps in a circuit, rendering deep quantum architectures no more effective than shallow ones.
The research was led by Armando Angrisani and Yihui Quek at the École Polytechnique Fédérale de Lausanne (EPFL), alongside collaborators from the Free University of Berlin and the University of Copenhagen. By analyzing large-scale quantum circuits built from two-qubit operations, the team found that at realistic noise levels, only the final few layers of a circuit significantly affect the computational outcome. This "forgetting" effect means that as a quantum circuit grows longer, the information processed in its initial stages gradually fades away, much like a message whispered down a long line of people, where only the last few retellings shape what is finally heard.
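The mechanism can be illustrated with a deliberately simple toy model. The sketch below is not the paper's construction: it tracks a single qubit as a Bloch vector through layers of random gates, each followed by amplitude damping (a standard non-unital noise channel that pulls the qubit toward a fixed point), and compares how much the output moves when the earliest layers are swapped out versus the last ones. The depth, the damping rate, and all helper names are illustrative choices.

```python
import numpy as np

# Toy model (illustrative, not the paper's setup): one qubit, tracked as a
# Bloch vector, passed through `depth` layers of random rotations, each
# followed by amplitude damping toward |0> with strength `gamma`.
rng = np.random.default_rng(7)
gamma = 0.15    # per-layer damping strength (hypothetical value)
depth = 40

def random_rotation():
    """Random 3D rotation of the Bloch sphere via QR decomposition."""
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return q * np.sign(np.linalg.det(q))  # force det = +1

def damp(r):
    """Amplitude damping: shrinks x and y, pulls z toward +1 (the |0> pole)."""
    s = np.sqrt(1.0 - gamma)
    return np.array([s * r[0], s * r[1], (1.0 - gamma) * r[2] + gamma])

def run(layers):
    r = np.array([0.0, 0.0, 1.0])  # start in |0>
    for rot in layers:
        r = damp(rot @ r)
    return r

def trace_distance(r1, r2):
    """For single qubits, trace distance is half the Bloch-vector distance."""
    return 0.5 * np.linalg.norm(r1 - r2)

base = [random_rotation() for _ in range(depth)]
early_swap = [random_rotation() for _ in range(5)] + base[5:]   # new FIRST 5
late_swap = base[:-5] + [random_rotation() for _ in range(5)]   # new LAST 5

print("output shift from changing early layers:",
      trace_distance(run(base), run(early_swap)))
print("output shift from changing late layers: ",
      trace_distance(run(base), run(late_swap)))
```

Running this, replacing the first five layers barely moves the output, while replacing the last five moves it substantially. That asymmetry, driven in the toy by the noise contracting every state toward its fixed point, is a cartoon of the forgetting the paper formalizes for large two-qubit-gate circuits.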
This discovery strikes at the heart of the "quantum advantage" narrative that has driven billions of dollars in venture capital and government subsidies into the sector. The prevailing industry logic has been that increasing "circuit depth"—the number of sequential operations a computer can perform—would unlock the ability to solve problems that are impossible for classical supercomputers. However, the EPFL-led study suggests that current hardware may be hitting a ceiling where adding more steps provides zero marginal utility. For investors, for tech giants like IBM and Google, and for pure-play vendors such as IonQ, this shifts the focus from scaling the number of qubits to the far more difficult task of radical noise reduction and error correction.
The findings also provide a sobering explanation for why some noisy quantum circuits appear "trainable" in machine learning applications. While researchers have noted that these systems can still be adjusted to produce specific results, the study indicates this is often because the noise has already erased most of the circuit's depth. In effect, the computer is not performing a deep, sophisticated calculation; it is behaving like a much simpler, shallow system that is easier to manipulate but lacks the transformative power originally promised by quantum theory.
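A back-of-the-envelope calculation shows how little of a deep computation can survive. Under a global depolarizing model (a textbook noise channel, chosen here for tractability rather than taken from the study), the state after L noisy layers is a mixture of the ideal output and pure randomness, and the ideal part shrinks geometrically with depth:

```python
# Under global depolarizing noise of strength p per layer, the output after
# L layers is (1-p)^L * (ideal output) + (1-(1-p)^L) * (maximally mixed
# state). The numbers below are illustrative, not taken from the paper:
p = 0.02
for L in (10, 50, 200):
    print(f"depth {L:>3}: coherent weight = {(1 - p) ** L:.4f}")
# depth 200 leaves under 2% of the ideal computation in the output.
```

On this simplified model, whatever remains adjustable during training lives in that rapidly vanishing coherent slice, which is why a noisy deep circuit can look tunable while computing nothing a shallow circuit could not.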
From a competitive standpoint, this research creates a clear divide between "noisy intermediate-scale quantum" (NISQ) devices and the theoretical goal of fault-tolerant quantum computing. While NISQ devices were once seen as a viable bridge to commercial utility, the "forgetting" effect suggests that bridge may not reach as far as anticipated. Companies that have bet heavily on near-term applications in chemistry or optimization may find their hardware incapable of reaching the necessary depth to outperform classical algorithms, which are themselves becoming more efficient at simulating noisy quantum systems.
However, the study is not an obituary for the industry. Instead, it serves as a technical roadmap, highlighting that the path to commercial viability lies in high-fidelity gates and active error correction rather than raw circuit length. While the "forgetting" effect limits what can be achieved today, it also clarifies the mathematical boundaries that engineers must overcome. The immediate impact will likely be a more cautious valuation of quantum startups that claim near-term breakthroughs without demonstrating a clear path to mitigating this noise-induced amnesia.
Explore more exclusive insights at nextfin.ai.
