Quantum Computers: From Lab to Reality – Why the New BB Code Is a Giant Leap Forward

How LDPC codes and bivariate bicycle codes emerged as the answer to quantum computing’s main obstacle – and why we’ll need far fewer qubits to correct errors 💡🔄💻


This is the beginning of a new series on MilovanInnovation – a journey into the world of quantum computers.
Slowly, quantum computing is emerging from the fog of hype and becoming a tangible reality. We take on the thankless and complex task of familiarizing our readers with the real state of a field that still hangs suspended between our current capabilities and knowledge on one side and the imagined applications on the other.

Quantum computing promises solutions to problems that classical computers could never solve in any practical amount of time. From drug design and new materials to fundamental scientific discoveries – the potential is enormous.

But there is one big problem: quantum information is incredibly fragile. Every interaction with the environment, every imperfection of materials, even cosmic rays can destroy delicate quantum states before computation even begins.

That is why error correction has become a key requirement for building a functional, scalable quantum computer.

Why Quantum Error Correction Is Harder Than Classical ⚙️❌

To understand the scale of the challenge, we must first grasp how different the operating conditions are for classical and quantum computers.

Classical computers today are incredibly reliable. Errors in memory and processor operations occur at rates of 1 in a billion (10⁻⁹) or even 1 in a quadrillion (10⁻¹⁵) operations. The reason lies in the simplicity of the basic unit of information: a classical bit has only two values – 0 or 1. An error reduces to a “bit flip” – a rare occurrence that we easily detect and correct with redundant coding.
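
To make the classical case concrete, here is a minimal sketch of such redundant coding – a three-bit repetition code that corrects any single bit flip by majority vote (plain Python, purely illustrative):

```python
def encode(bit):
    """Triple-repetition code: protect one bit by storing three copies."""
    return [bit, bit, bit]

def flip(codeword, position):
    """Simulate a single bit-flip error at the given position."""
    corrupted = codeword.copy()
    corrupted[position] ^= 1
    return corrupted

def decode(codeword):
    """Majority vote: a single flipped copy is outvoted by the two intact ones."""
    return int(sum(codeword) >= 2)

codeword = encode(1)           # [1, 1, 1]
corrupted = flip(codeword, 0)  # [0, 1, 1] – one copy damaged
assert decode(corrupted) == 1  # majority vote still recovers the original bit
```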

Quantum computers, on the other hand, operate with error rates of 1 in 100 (10⁻²) to 1 in 1,000 (10⁻³) – millions to trillions of times worse than classical systems. Why?

The answer lies in the nature of the qubit. Unlike a classical bit, a qubit exists in a superposition of states 0 and 1. Its state is not described by a single number, but by amplitude and phase – giving it enormous information‑carrying capacity, but also making it extremely sensitive. An error cannot be reduced to just a “bit flip” – phase flips and more complex combinations are possible. Any unwanted influence from the environment destroys the delicate quantum coherence, erasing information before we even use it.
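
A minimal numerical sketch of this difference – a qubit as a two-component amplitude vector, with the bit-flip (X) and phase-flip (Z) errors acting on it (NumPy, purely illustrative):

```python
import numpy as np

# A qubit state is a complex 2-vector of amplitudes for |0> and |1>.
# Here: the equal superposition (|0> + |1>) / sqrt(2).
psi = np.array([1, 1]) / np.sqrt(2)

X = np.array([[0, 1], [1, 0]])   # bit flip:   |0> <-> |1>
Z = np.array([[1, 0], [0, -1]])  # phase flip: |1> -> -|1>

bit_flipped = X @ psi    # amplitudes swapped – the classical-looking error
phase_flipped = Z @ psi  # sign of the |1> amplitude inverted

# Measurement probabilities (|amplitude|^2) do not reveal the phase flip...
print(np.abs(psi) ** 2, np.abs(phase_flipped) ** 2)  # identical: [0.5 0.5]
# ...yet the state itself has changed – a purely quantum error with no
# classical analogue, which is why bit-flip protection alone is not enough.
print(np.allclose(psi, phase_flipped))  # False
```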

That is why we need sophisticated error correction – not to fix rare errors, but to enable operation in a world where errors are the rule, not the exception.

The Problem with the Surface Code: Accurate, but Expensive 🧱💸

For many years, the surface code was the dominant approach to quantum error correction. It offers a high error threshold (around 1%), relatively fast decoding algorithms, and compatibility with two‑dimensional square lattices used by today’s quantum processors.

However, there is a price: extremely low encoding efficiency. To protect one logical qubit, you need hundreds of physical qubits. For a surface code with distance d, the net encoding rate is roughly 1/(2d²) – so as the distance (and thus the fault tolerance) grows, the rate falls quadratically and the qubit overhead quickly becomes impractical.

For 100 logical qubits (which is just the beginning of serious quantum computation), you would need tens of thousands of physical qubits. This huge hardware overhead was long considered unavoidable.
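
A back-of-the-envelope calculation of that overhead, using the rough 1/(2d²) rate from above (illustrative numbers only):

```python
def surface_code_overhead(distance, logical_qubits):
    """Rough physical-qubit count for the surface code:
    ~d^2 data qubits plus ~d^2 ancillas per logical qubit, i.e. rate ~1/(2d^2)."""
    physical_per_logical = 2 * distance ** 2
    return logical_qubits * physical_per_logical

for d in (5, 11, 21):
    print(f"d={d:2d}: {surface_code_overhead(d, 100):,} physical qubits for 100 logical")
# d= 5:  5,000
# d=11: 24,200
# d=21: 88,200 – tens of thousands, growing quadratically with distance
```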

LDPC Codes: A Return to Efficiency 🔁📈

In recent years, attention has turned to LDPC (low‑density parity‑check) codes – a family of codes where each check operator acts on a small number of qubits and each qubit participates in only a few checks.

LDPC codes offer asymptotically constant encoding rates and linear distance – in contrast to the surface code, whose rate tends to zero as distance increases. This means you can achieve the same level of protection with significantly fewer physical qubits.
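
The "low-density" property is easy to show in code: in the parity-check matrix, every row (check) and every column (qubit) keeps a small, constant weight no matter how large the code grows. A toy illustration (the matrix below is made up, not a real quantum code):

```python
import numpy as np

# A toy sparse parity-check matrix: rows are checks, columns are (qu)bits.
# In an LDPC code both row and column weights stay small and constant as
# the code grows – here every check touches 3 bits and every bit sits in
# exactly 2 checks.
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
    [0, 0, 1, 1, 0, 1],
])

print("check weights:", H.sum(axis=1))  # [3 3 3 3]
print("qubit weights:", H.sum(axis=0))  # [2 2 2 2 2 2]
```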

However, it was previously thought that to outperform the surface code under realistic noise (memory errors, gate errors, state preparation and measurement errors), huge LDPC codes with over 10,000 physical qubits would be needed.

New BB Codes: From Theory to Practice 🚀📊

The latest papers (published 2025/2026) present concrete examples of high‑efficiency LDPC codes with just a few hundred physical qubits that already show outstanding performance. Most are from US teams, but no doubt Chinese labs are working on similar solutions – just without much marketing.

Bivariate bicycle (BB) codes are a generalization of earlier ideas. Their key features:

  • Parameters [[n, k, d]]: For example, the [[144, 12, 12]] code encodes 12 logical qubits into 144 physical qubits (plus 144 ancilla qubits for syndrome measurement – 288 physical qubits in total) with distance 12.
  • Net encoding rate r = 1/24 – about 10 times more efficient than a surface code of the same distance.
  • The Tanner graph (which connects each qubit to the checks acting on it) has degree 6 and thickness 2 – meaning it can be decomposed into two planar subgraphs, which is excellent for implementation with superconducting qubits (two layers of coupled resonators). A construction sketch follows this list.
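
For the more hands-on reader, the BB construction itself fits in a few lines. The sketch below assumes the polynomials reported in the literature for the [[144, 12, 12]] code (A = x³ + y + y², B = y³ + x + x², on a 12×6 torus of indices); it verifies the CSS commutation condition and counts the logical qubits, but computing the distance d requires separate, heavier tools:

```python
import numpy as np

l, m = 12, 6  # the code uses 2*l*m = 144 data qubits

def cyclic_shift(size):
    """size x size cyclic permutation matrix."""
    return np.roll(np.eye(size, dtype=int), 1, axis=1)

# Commuting shift operators x and y on an l x m torus of indices.
x = np.kron(cyclic_shift(l), np.eye(m, dtype=int))
y = np.kron(np.eye(l, dtype=int), cyclic_shift(m))

def p(M, k):
    """Matrix power over GF(2)."""
    return np.linalg.matrix_power(M, k) % 2

A = (p(x, 3) + y + p(y, 2)) % 2  # A = x^3 + y + y^2
B = (p(y, 3) + x + p(x, 2)) % 2  # B = y^3 + x + x^2

# X- and Z-type parity checks of the CSS code.
HX = np.hstack([A, B])
HZ = np.hstack([B.T, A.T])

# CSS condition: every X check must commute with every Z check (mod 2).
assert not (HX @ HZ.T % 2).any()
print(int(HX.sum(axis=1)[0]))  # 6 – each check touches 6 qubits (degree-6 Tanner graph)

def gf2_rank(M):
    """Rank over GF(2) via Gaussian elimination."""
    M, rank = M.copy() % 2, 0
    for col in range(M.shape[1]):
        pivot = next((r for r in range(rank, M.shape[0]) if M[r, col]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]
        for r in range(M.shape[0]):
            if r != rank and M[r, col]:
                M[r] ^= M[rank]
        rank += 1
    return rank

n = 2 * l * m
k = n - gf2_rank(HX) - gf2_rank(HZ)
print(n, k)  # expected 144, 12 if the polynomials are as reported
```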

What Does “Distance” Mean in Quantum Codes and Machine Learning? 📏🤖

Before we look at performance, it’s important to understand what distance is – a concept that appears both in quantum error correction and machine learning, albeit in different contexts.

In coding theory, the distance d denotes the minimum weight of an error that can cause a logical error without being detected. In other words, it is the smallest number of physical qubits an error must hit to “slip through” the protection. A larger distance means the code is more resilient – more physical errors are needed to compromise a single logical qubit.

In machine learning, distance has a related but different meaning. It is a way to mathematically quantify the difference between two data points by computing the “distance” in an abstract high‑dimensional data space. The smaller the distance, the more similar the compared data points. In both cases, distance is a measure of difference – whether between correct and incorrect quantum states or between two data vectors.
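
A tiny illustration of both notions side by side (NumPy; the vectors are made up):

```python
import numpy as np

# ML-style distance: Euclidean distance between two feature vectors.
# Small value = similar data points.
a = np.array([1.0, 2.0, 3.0])
b = np.array([1.5, 2.0, 2.0])
euclidean = np.linalg.norm(a - b)

# Coding-style distance: Hamming distance between two bit strings –
# the number of positions in which they differ, the direct analogue
# of code distance.
u = np.array([0, 1, 1, 0, 1])
v = np.array([0, 1, 0, 0, 0])
hamming = int((u != v).sum())

print(euclidean)  # ~1.118
print(hamming)    # 2
```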

Game‑Changing Performance 🎯📉

The table accompanying the work shows dramatic improvements:

[[n, k, d]]        Net rate r   Pseudo-threshold p₀   p_L(10⁻³)   p_L(10⁻⁴)
[[144, 12, 12]]    1/24         0.0065                2×10⁻⁷      8×10⁻¹³
[[288, 12, 18]]    1/48         0.0069                2×10⁻¹²     1×10⁻²²

What do these numbers mean?

  • The pseudo‑threshold around 0.65‑0.7% means these codes are practically comparable to the surface code in resistance to physical errors.
  • The logical error rate p_L at a physical error rate of 10⁻³ is 2×10⁻⁷ for the [[144,12,12]] code – dramatically low.
  • Super‑exponential suppression: when the physical error rate drops from 10⁻³ to 10⁻⁴, the logical error rate falls from 2×10⁻⁷ to 8×10⁻¹³ – almost six orders of magnitude of improvement (a quick sanity check follows this list).
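
Using nothing but the two table rows above, we can sanity-check the claimed suppression:

```python
import math

# Two data points from the table for the [[144, 12, 12]] code.
p1, pL1 = 1e-3, 2e-7
p2, pL2 = 1e-4, 8e-13

# Effective suppression exponent alpha in p_L ~ p^alpha over this range.
alpha = math.log10(pL1 / pL2) / math.log10(p1 / p2)
print(f"alpha ≈ {alpha:.1f}")  # ≈ 5.4, close to d/2 = 6 – roughly what a
# distance-12 code should deliver (errors of weight < d/2 are corrected)
```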

Why This Matters for a Practical Quantum Computer 🔧🧩

  1. Fewer physical qubits per logical qubit – instead of hundreds, now a few dozen (a numeric comparison follows this list).
  2. Sharper transition in the near‑threshold regime – even a small hardware improvement yields a huge jump in reliability.
  3. Implementation with only 6 neighbours per qubit – the thickness‑2 Tanner graph allows realisation on two planar surfaces, compatible with superconducting architectures.
  4. Ready for demonstration – with 288 physical qubits, the [[144,12,12]] code is within reach of today’s experimental platforms (e.g., IBM, Google, Quantinuum).
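
To put the first point in numbers – a minimal comparison under the same rough 1/(2d²) assumption used earlier in the article (illustrative, not taken from the papers):

```python
# Physical qubits per logical qubit at distance 12.
d = 12
surface_per_logical = 2 * d ** 2   # ~288 for one surface-code logical qubit
bb_per_logical = (144 + 144) / 12  # [[144,12,12]] data + ancillas, 12 logical

print(surface_per_logical)                   # 288
print(bb_per_logical)                        # 24.0
print(surface_per_logical / bb_per_logical)  # 12.0 – an order-of-magnitude saving
```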

What This Means for the Future of Quantum Computing 🔮🌐

These results suggest that quantum error correction does not have to be prohibitively expensive. Highly efficient codes like BB codes bring us closer to practically useful quantum computers:

  • Scalability: Instead of millions of physical qubits for a few hundred logical qubits, perhaps a few tens of thousands will suffice.
  • Modularity: Each logical qubit can be addressed independently, enabling larger systems to be built.
  • Near‑term feasibility: Codes with a few hundred physical qubits are experimentally feasible within the next few years.

Conclusion: Crossing the Threshold of Practicality 🏁✨

For a long time, the main challenge of quantum computing was thought to be building enough qubits. Now it turns out that the choice of error‑correcting code is equally important.

BB codes show that we can achieve more with less. Instead of fighting a huge hardware overhead, we can design smarter ways to protect quantum information.

This does not mean all problems are solved – far from it. But it does mean that the path to a scalable quantum computer is becoming more realistic and closer than ever.


Question for you: Do you believe that quantum computers with error correction will become practically usable in the next decade? And what do you think – will these advances in error correction open the door to a deeper understanding of consciousness, as Penrose suggests?

