Temperamental machinery and cosmic rays¶
Quantum computers are the most temperamental pieces of machinery humanity has constructed since someone decided steam engines were a good idea. They require temperatures colder than interstellar space, isolation from electromagnetic interference, and protection from cosmic radiation that their classical counterparts barely notice. A classical computer can survive being dropped, exposed to coffee spills, and operated in environments that would horrify its designers. A quantum computer experiences existential collapse from stray photons, thermal fluctuations measured in millionths of a kelvin, and the occasional cosmic ray that happens to pass through the laboratory.
Building quantum computers resembles maintaining the Unseen University’s High Energy Magic building, where reality is thin and anything might happen at unfortunate moments. The hardware must be pampered, monitored constantly, and recalibrated whenever any of the thousand things that can go wrong inevitably do go wrong. Engineers speak of coherence times and error rates with the weary resignation of people who’ve learned that quantum mechanics is personally antagonistic toward their career satisfaction.
The gap between quantum computing as imagined by theorists and quantum computing as built by engineers contains all the practical difficulties that make the field simultaneously fascinating and infuriating. Theory says quantum computers should work. Hardware says the universe has opinions about that.
Dilution refrigerators or keeping qubits colder than space¶
Most quantum computers use superconducting qubits, which only exhibit quantum behaviour at temperatures near absolute zero. Not metaphorically near. Actually near. Around 15 millikelvin, which is 0.015 degrees above absolute zero, colder than interstellar space by a factor of two hundred. Achieving these temperatures requires dilution refrigerators, which are exactly as expensive and complicated as they sound.
A dilution refrigerator works by exploiting quantum properties of helium-3 and helium-4 mixtures to reach temperatures below what conventional refrigeration can achieve. The system has multiple cooling stages, each progressively colder, with the quantum processor mounted at the very bottom inside layers of magnetic shielding and thermal isolation. The entire apparatus resembles a chandelier designed by someone who really hates maintenance engineers, all hanging cables and nested cylinders wrapped in gold-plated copper.
Cooling down takes days. The system must be carefully ramped through temperature stages to avoid thermal shock. Once cold, it must stay cold because warming up and cooling down again takes more days and risks damaging the delicate quantum circuits. This means quantum computers run continuously, consuming significant power for refrigeration even when idle, and any maintenance requiring warm-up involves losing days of operational time.
The temperatures involved are so extreme that they require careful consideration of every material in the system. Metals that are excellent conductors at room temperature might become superconductors or develop problematic magnetic properties at millikelvin temperatures. Plastics that are stable at room temperature might outgas contaminants. Even the wiring carrying signals to the qubits must be carefully designed to conduct signals while blocking heat.
Vibrations are also problematic. Any physical disturbance can couple energy into the quantum system, causing decoherence. Dilution refrigerators include vibration isolation, but they can’t eliminate all mechanical disturbances. Laboratories are built with floating floors and isolated from external vibration sources. Even so, nearby construction, heavy traffic, or someone dropping something in an adjacent room can potentially affect the quantum processor.
The operational cost is substantial. Dilution refrigerators consume kilowatts of power continuously. They require liquid helium, which is expensive and periodically scarce thanks to recurring global shortages. They need constant monitoring by engineers who understand cryogenics, vacuum systems, and quantum hardware. A single dilution refrigerator costs hundreds of thousands of euros, and that’s before installing the actual quantum processor inside it.
Alternative qubit technologies exist that don’t require such extreme cooling. Ion trap quantum computers operate at room temperature, using electromagnetic fields to trap individual ions and manipulate their quantum states with lasers. Topological qubits, if they work, might operate at higher temperatures. But superconducting qubits currently dominate the field because despite the refrigeration requirements, they’re the technology that’s advanced furthest toward practical quantum computers.
Quantum error correction or shouting over the noise¶
Quantum computers make mistakes constantly. Gate operations are imperfect. Qubits decohere. External noise couples into quantum states. Cosmic rays strike the quantum processor. Without error correction, quantum computations accumulate errors until the result is meaningless garbage, which happens after a few thousand operations at current error rates.
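The arithmetic of accumulation is easy to sketch. Assuming, purely for illustration, that each gate fails independently with probability p, the chance of an error-free run shrinks exponentially with circuit depth; the 0.1 percent per-gate figure below is an illustrative round number, not a measurement of any particular machine.

```python
# Toy model: if each gate fails independently with probability p, a circuit
# of n gates finishes with no errors at all with probability (1 - p)**n.

def circuit_success_probability(n_gates: int, p_error: float = 1e-3) -> float:
    """Probability that every gate in an n-gate circuit succeeds."""
    return (1.0 - p_error) ** n_gates

for n in (100, 1_000, 5_000, 10_000):
    print(f"{n:>6} gates: {circuit_success_probability(n):8.4%} chance of no error")

# At 0.1% per gate: roughly 90% at 100 gates, 37% at 1,000, under 1% at
# 5,000. That cliff is where 'meaningless garbage' begins.
```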
Quantum error correction addresses this by encoding each logical qubit in many physical qubits, spreading the quantum information redundantly so that errors can be detected and corrected without measuring the logical qubit directly. The most studied approach is the surface code, which arranges physical qubits in a two-dimensional grid and performs regular syndrome measurements to detect errors.
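A full surface code is far too involved for a short sketch, but its humble ancestor, the three-qubit bit-flip repetition code, shows the same trick in miniature: redundant encoding plus parity (syndrome) measurements that locate an error without ever reading the logical value. The Monte Carlo below is a toy classical model of that idea, not a simulation of real hardware.

```python
import random

# Three-qubit bit-flip repetition code: logical 0 -> 000, logical 1 -> 111.
# Pairwise parity checks identify a single flipped qubit without revealing
# the logical bit itself.

def encode(logical_bit: int) -> list[int]:
    return [logical_bit] * 3

def noisy_channel(qubits: list[int], p_flip: float) -> list[int]:
    # Flip each physical qubit independently with probability p_flip.
    return [q ^ (random.random() < p_flip) for q in qubits]

def correct(qubits: list[int]) -> list[int]:
    # Syndromes: parities of neighbouring pairs. Together they point at the
    # single flipped qubit, assuming at most one flip occurred.
    s1 = qubits[0] ^ qubits[1]
    s2 = qubits[1] ^ qubits[2]
    if s1 and not s2:
        qubits[0] ^= 1
    elif s1 and s2:
        qubits[1] ^= 1
    elif s2 and not s1:
        qubits[2] ^= 1
    return qubits

p, trials, failures = 0.05, 100_000, 0
for _ in range(trials):
    failures += correct(noisy_channel(encode(0), p)) != [0, 0, 0]
print(f"physical error rate {p:.3f} -> logical error rate {failures / trials:.4f}")

# Two simultaneous flips still fool the decoder, so the logical rate is about
# 3 * p**2 (~0.0073 here). The code only helps when p is small enough that
# 3 * p**2 < p: a toy version of the threshold idea behind the surface code.
```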
The challenge is that quantum error correction requires substantial overhead. Each logical qubit might need hundreds or thousands of physical qubits depending on the desired error rate. Performing error correction itself requires quantum operations that introduce their own errors, so you need physical qubits with error rates below a threshold before error correction helps rather than making things worse.
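The scale of that overhead can be estimated with the standard rule of thumb for the surface code, in which the logical error rate falls as roughly A(p/p_th)^((d+1)/2) for code distance d. The constants below (A = 0.1, threshold p_th = 1 percent) are ballpark figures from the literature, not properties of any specific device.

```python
# Back-of-envelope surface code overhead, using assumed ballpark constants.

def physical_qubits(d: int) -> int:
    # A distance-d surface code patch: d*d data qubits plus d*d - 1
    # measurement qubits.
    return 2 * d * d - 1

def logical_error_rate(p: float, d: int, p_th: float = 1e-2, A: float = 0.1) -> float:
    return A * (p / p_th) ** ((d + 1) / 2)

p_target = 1e-12  # low enough to survive ~10^12 logical operations
for p in (5e-3, 1e-3, 1e-4):
    d = 3
    while logical_error_rate(p, d) > p_target:
        d += 2    # surface code distances are odd
    print(f"p = {p:.0e}: distance {d}, ~{physical_qubits(d):,} physical qubits per logical qubit")

# ~10,657 physical qubits per logical qubit at p = 5e-3, ~881 at 1e-3,
# ~241 at 1e-4: hence 'hundreds or thousands', and hence the millions below.
```

Note how the numbers fall: improving the physical error rate shrinks the overhead far faster than adding qubits does, which is why hardware teams obsess over gate fidelities rather than simply building bigger chips.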
Current superconducting qubits have gate error rates around 0.1 to 1 percent, near or just below the roughly 1 percent threshold at which surface code error correction starts to help, at least in theory. Practically, you still need many physical qubits per logical qubit and must perform error correction continuously throughout the computation. Google’s latest quantum processors have around 100 qubits. After implementing error correction, you might get one or two logical qubits. Maybe.
The error correction community is working toward better codes, better qubit connectivity, and improved error rates. Progress is real, but the overhead remains substantial. Practical quantum computers that can run useful algorithms will need thousands of logical qubits, which means millions of physical qubits with error correction, which is considerably more than the dozens or hundreds of physical qubits in current systems.
Ion trap quantum computers have better gate fidelities, sometimes above 99.9 percent, but scaling to large numbers of ions poses its own challenges. Topological qubits promise built-in error protection through their physical structure, but building topological qubits has proven difficult. Superconducting qubits scale more easily but have worse error rates. Every approach involves trade-offs, none have solved the error correction problem completely, and substantial work remains.
The practical implication is that current quantum computers are noisy intermediate-scale quantum devices. They’re too small and error-prone for error correction to help meaningfully, so algorithms must be designed to work despite noise. This limits the depth of quantum circuits that produce useful results and restricts quantum computers to problems where approximate, noisy answers are acceptable.
Topological qubits or theoretically better, practically theoretical¶
Topological qubits are quantum computing’s version of fusion power: always promising, perpetually just a few years away, and potentially revolutionary if they ever actually work at scale. The theory is beautiful. Instead of encoding quantum information in properties like electron spin or superconducting phase that are susceptible to local noise, topological qubits encode information in global properties of exotic quantum states that are inherently protected from local disturbances.
The most prominent approach uses anyons, which are quasiparticles that exist in certain two-dimensional quantum systems and have unusual exchange statistics, neither bosonic nor fermionic, that make them suitable for quantum computation. Braiding these anyons around each other performs quantum gates, and the information is protected because local perturbations can’t change the global topology of the braiding pattern. In principle, topological qubits should have much lower error rates than conventional qubits without requiring extensive error correction overhead.
Microsoft has invested heavily in topological quantum computing, building research teams and developing software frameworks for eventual topological quantum computers. The challenge is that creating and manipulating anyons requires exotic materials in extremely controlled conditions. The leading candidate is Majorana zero modes in semiconductor-superconductor hybrid structures, which have proven extraordinarily difficult to create and characterise reliably.
After years of effort, the field has produced some evidence of Majorana signatures in experiments, but also some retracted papers when results couldn’t be reproduced or turned out to have alternative explanations. Creating a single topological qubit reliably remains challenging. Scaling to many topological qubits performing fault-tolerant quantum computation remains a distant goal. The timeline keeps sliding forward as each technical milestone proves harder than expected.
The theoretical advantages are real. If topological qubits work as hoped, they could dramatically reduce the overhead for error correction and enable practical quantum computers sooner than other approaches. The uncertainty is whether the engineering challenges can be overcome and whether the performance advantages materialise in real hardware rather than theoretical calculations.
Other quantum computing companies have largely stopped waiting for topological qubits and proceeded with superconducting or ion trap approaches that work now, even if they require error correction. Microsoft remains committed to the topological approach but has also started developing software that works with other quantum hardware in case topological qubits take longer than hoped. The field collectively shrugs and continues working on multiple approaches because nobody knows for certain which will succeed.
Cloud quantum services or rent time on someone else’s delicate machinery¶
Building and maintaining a quantum computer requires millions of euros in capital investment, cleanroom facilities, teams of specialised engineers, and ongoing operational costs for cryogenics and maintenance. Most researchers and businesses cannot justify this investment, so cloud quantum computing services have emerged that let users rent time on quantum processors operated by companies with deeper pockets and higher tolerance for cryogenic engineering.
IBM Quantum provides cloud access to their superconducting quantum processors, ranging from small educational systems to their largest research processors. Users can run quantum circuits through a web interface or programmatic API, with results returned once the computation completes. Access to small systems is free for research and education. Access to larger systems requires formal collaboration or payment. The systems are shared among many users, so jobs queue and may take hours or days to execute depending on demand.
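A minimal submission looks roughly like the sketch below. It assumes a saved IBM Quantum account and follows the Qiskit Runtime primitives pattern (SamplerV2) current in Qiskit 1.x; the API has shifted between versions, so treat the exact names as a snapshot rather than gospel.

```python
from qiskit import QuantumCircuit
from qiskit.transpiler.preset_passmanagers import generate_preset_pass_manager
from qiskit_ibm_runtime import QiskitRuntimeService, SamplerV2

service = QiskitRuntimeService()          # reads previously saved credentials
backend = service.least_busy(operational=True, simulator=False)

bell = QuantumCircuit(2)                  # smallest meaningful test: a Bell pair
bell.h(0)
bell.cx(0, 1)
bell.measure_all()

# Real devices only accept circuits compiled to their native gates and layout.
compiled = generate_preset_pass_manager(backend=backend, optimization_level=1).run(bell)

job = SamplerV2(mode=backend).run([compiled], shots=1024)
print(job.job_id())                       # the job now waits in the shared queue
counts = job.result()[0].data.meas.get_counts()
print(counts)                             # ideally mostly '00' and '11'
```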
Amazon Braket provides access to quantum computers from multiple vendors including IonQ, Rigetti, and the D-Wave quantum annealers. Users pay per task execution based on the quantum hardware used and number of shots requested. The service integrates with Amazon’s cloud infrastructure, allowing users to combine quantum computations with classical cloud computing for hybrid algorithms. This is convenient but expensive, with costs accumulating quickly for extensive experimentation.
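The Braket pattern is similar. The sketch below targets the SV1 managed simulator so that experimenting does not accrue per-shot hardware charges; swapping in a hardware device ARN (IonQ, Rigetti, and so on) is the only change. It assumes AWS credentials and a Braket-enabled region are already configured.

```python
from braket.aws import AwsDevice
from braket.circuits import Circuit

# SV1 is Amazon's managed state-vector simulator; hardware devices have
# their own ARNs and bill per task plus per shot.
device = AwsDevice("arn:aws:braket:::device/quantum-simulator/amazon/sv1")

bell = Circuit().h(0).cnot(0, 1)          # the same Bell pair as above
task = device.run(bell, shots=1000)       # on hardware, every shot is billed
print(task.result().measurement_counts)   # e.g. Counter({'00': 507, '11': 493})
```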
Google Cloud offers access to their quantum processors, though with more restricted availability than IBM. Microsoft Azure Quantum provides access to IonQ and other quantum hardware along with Microsoft’s quantum development tools. These services target enterprises and researchers willing to pay for cloud resources while exploring quantum computing.
The advantages are obvious. No capital investment, no maintenance burden, no need to hire cryogenic engineers or quantum physicists for your IT department. You can experiment with quantum algorithms, test quantum software, and understand quantum computing without building quantum computers. For research groups, startups, and businesses exploring quantum computing, cloud access is the practical approach.
The disadvantages include limited access to the most advanced systems, queuing delays when demand is high, pricing that can become expensive for extensive use, and restricted ability to customise hardware or access low-level controls. You’re using shared infrastructure maintained by others, which means you get what they provide rather than what you might ideally want.
Cloud quantum services also mean trusting the provider with your quantum algorithms and data. For sensitive applications, sending your computations to external quantum systems raises security and intellectual property concerns. Quantum cloud providers implement security measures, but you’re still executing code on hardware you don’t control, with measurement results passing through infrastructure operated by third parties.
The practical reality is that cloud quantum services democratise access to quantum computers while shifting the operational burden to specialised providers. This is probably necessary for the field’s growth because the number of organisations willing to operate dilution refrigerators is rather smaller than the number interested in quantum computing. Whether this model persists as quantum computers mature or whether quantum hardware eventually becomes deployable on-site like classical servers remains uncertain.
Why a falling brick or cosmic ray can ruin your day¶
Quantum computers are vulnerable to environmental disturbances that classical computers barely notice. A cosmic ray passing through the processor can flip qubits by depositing energy into the quantum system. Electromagnetic interference from nearby electronics can couple into quantum circuits. Mechanical vibrations can disturb the dilution refrigerator. Even the Earth’s magnetic field, if not properly shielded, can affect quantum states.
Cosmic rays are high-energy particles from space, mostly protons, that strike Earth constantly. When they hit the atmosphere, they create cascades of secondary particles that reach ground level. Most of these particles pass harmlessly through ordinary matter. Quantum computers, operating at millikelvin temperatures with delicate quantum states, are considerably less indifferent to ionising radiation.
A single high-energy particle passing through a quantum processor can deposit enough energy to flip qubits, cause decoherence, or introduce errors into quantum computations. The probability is low for any individual cosmic ray strike, but quantum computers operate for extended periods and cosmic rays are numerous. Over hours of operation, the probability of cosmic-ray-induced errors becomes non-negligible.
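A back-of-envelope Poisson estimate shows just how non-negligible. The sea-level muon flux of roughly one per square centimetre per minute is a standard textbook figure; the one square centimetre chip area is a hypothetical round number, and not every strike deposits enough energy in the wrong place to matter.

```python
import math

# P(at least one muon passes through the chip during a run of given length),
# modelling strikes as a Poisson process.
FLUX_PER_CM2_PER_MIN = 1.0   # standard sea-level muon flux, roughly
CHIP_AREA_CM2 = 1.0          # hypothetical chip area

def p_at_least_one_strike(minutes: float) -> float:
    expected_strikes = FLUX_PER_CM2_PER_MIN * CHIP_AREA_CM2 * minutes
    return 1.0 - math.exp(-expected_strikes)

for minutes in (0.1, 1.0, 60.0):
    print(f"{minutes:5.1f} min run: P(>=1 strike) = {p_at_least_one_strike(minutes):.1%}")

# A six-second run already has ~10% odds of a strike; over an hour it is a
# near-certainty. The open question is how often a strike actually corrupts
# the computation, not whether particles arrive.
```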
Classical computers experience similar issues with cosmic rays causing bit flips in memory, which is why servers use error-correcting memory. The error rates are low enough that personal computers usually don’t bother with protection and just accept occasional random errors. Quantum computers have higher error rates from other sources, and cosmic rays add to the error budget in ways that are difficult to predict or mitigate.
Electromagnetic interference is more controllable but still problematic. The quantum processor sits inside layers of electromagnetic shielding, but signals must pass in and out for control and measurement. Every cable is a potential antenna picking up interference. Every electronic component near the quantum system is a potential noise source. Radio transmitters, mobile phones, and even electric motors in nearby equipment can couple noise into quantum systems if shielding is imperfect.
Mechanical vibrations couple energy into the dilution refrigerator and quantum processor. Building construction, heavy vehicles passing nearby, or even someone walking heavily in the laboratory can introduce vibrations. Laboratories housing quantum computers are designed with vibration isolation, but they cannot eliminate all disturbances. An earthquake, even a small one, would be catastrophic for quantum coherence.
The cumulative effect is that quantum computers require carefully controlled environments, constant monitoring, and acceptance that despite all precautions, random errors will occur from sources you cannot completely eliminate. Running a quantum computer means operating in a regime where fundamental physics is personally antagonistic and the universe keeps finding creative ways to introduce noise into your carefully prepared quantum states.
This environmental sensitivity is not a temporary engineering problem that better design will eliminate. It’s fundamental to quantum mechanics that quantum states are fragile and easily disturbed. Error correction can compensate, but only if error rates are low enough and you have sufficient qubits for redundancy. Until then, quantum computers remain exquisitely sensitive devices that require substantial effort to coax into producing useful results while cosmic rays, thermal fluctuations, and electromagnetic noise conspire against coherence.
The engineers maintaining quantum computers develop philosophical attitudes toward these challenges. They calibrate meticulously, they shield carefully, they monitor continuously, and they accept that quantum decoherence will eventually win. The goal is to extract useful computation before that happens, then reset the system and try again. It’s less triumphant than quantum supremacy press releases suggest, but considerably more honest about the actual experience of operating quantum hardware in a universe that wasn’t designed for human convenience or quantum computing ambitions.