Qubits behaving badly

Qubits are the guild’s unruly apprentices, the sort who swear blind they followed the instructions whilst the lab bench is still smouldering. They exist in superposition, which is a polite way of saying they refuse to pick a lane and would prefer to be everywhere and nowhere at once. This is sold as a feature, much like calling a dragon “thermally expressive”.

Entanglement binds them in invisible arguments. Tug one qubit and its partner, three streets away and pretending not to be involved, will yelp in perfect synchrony. Naturally, no one can explain why this works without waving their hands in a way that suggests either deep insight or a desperate hope that you will stop asking questions.

Then comes decoherence, the inevitable hangover. One stray vibration, a drifting magnetic field, or an engineer sneezing too assertively, and your beautifully balanced quantum state collapses into a puddle of statistical despair. All your careful preparation vanishes faster than a thief who has just remembered he left the loot on the bar.

Qubits are also spectacularly expensive, maddeningly temperamental, and highly susceptible to cosmic rays, low-flying pigeons, or the occasional falling brick. Managing them is less “engineering” and more “attempting to placate a wizard with a long memory and a short fuse”. And yet the guild insists this is the future of computing. Perhaps it is. Or perhaps it is just another lesson in why you never trust an apprentice who grins when you are not looking.

Superposition, or why qubits cannot make up their minds

Classical bits are straightforward. They are either a zero or a one, on or off, cat alive or cat dead. They have the clarity of a well-made door, which is either open or closed and never embarrassingly both.

Qubits reject such simplicity. A qubit exists in superposition, meaning it is zero and one simultaneously, in proportions that shift and shimmer like a street performer’s promise. Technically this is described as a linear combination of basis states. Practically it means the qubit is hedging its bets harder than a merchant at dice.
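
The bookkeeping behind "linear combination of basis states" is mercifully small. A sketch in plain Python, with amplitudes chosen for illustration:

```python
import math

# A qubit state |psi> = a|0> + b|1> is just two complex amplitudes
# whose squared magnitudes sum to 1.
def qubit(a, b):
    norm = math.sqrt(abs(a) ** 2 + abs(b) ** 2)
    return (a / norm, b / norm)

# An equal superposition: the qubit hedging its bets 50/50.
psi = qubit(1, 1)

# Measurement probabilities come from the Born rule: |amplitude|^2.
p0 = abs(psi[0]) ** 2
p1 = abs(psi[1]) ** 2
print(p0, p1)  # each is 0.5 for the equal superposition
```

Unequal amplitudes shift the odds: `qubit(1, 2)` weights the dice towards one, which is as close as a qubit gets to making up its mind.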

Why this is allegedly useful

Superposition allows quantum computers to explore multiple solutions at once. Where a classical computer must check each possibility sequentially, like a watchman inspecting every door on Treacle Mine Road, a quantum computer considers all possibilities simultaneously, like a particularly efficient burglar who exists in several buildings at once until someone notices.

This parallelism is the source of quantum computing’s theoretical advantage. Emphasis on theoretical. Because the moment you try to check what answer your qubits have arrived at, superposition collapses. The qubit picks a state, probably not the one you wanted, and all that lovely parallelism vanishes like a political promise after the election.

The measurement problem

Observing a qubit changes it. This is not philosophy, this is physics behaving like a particularly awkward witness. The act of measurement forces the qubit to choose a state, collapsing the superposition into a definite answer.

This would be fine if you could measure carefully, but you cannot. Measurement is violent. It is like asking someone a delicate question by shouting it whilst they are balanced on a tightrope. You get an answer, but it is probably not the one they would have given under calmer circumstances.

Worse, you only get one measurement. Measure once, qubit collapses, superposition gone. Want to check your answer? Run the entire computation again. Hope you get the same result. Often you do not, because quantum mechanics is a field of study that specialises in making certainty uncomfortable.

Living with superposition

Practical quantum computing requires designing algorithms that make clever use of superposition whilst minimising measurements. This is like trying to conduct an orchestra where the musicians vanish if you look at them too directly. It can be done, but it requires either brilliance or a high tolerance for frustration.

The guild’s approach is to run the computation many times, measure at the end each time, and collate the results statistically. If you are lucky, the right answer appears more often than the wrong ones. If you are unlucky, you spend three weeks debugging a circuit that produces elegant nonsense.
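
The collation procedure looks like this in miniature. The 70% success rate below is invented for illustration, not measured from any real hardware:

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the sketch is repeatable

def run_noisy_circuit():
    # 70% of runs yield the "right" bitstring; the rest yield noise.
    if random.random() < 0.7:
        return "101"
    return random.choice(["000", "011", "110"])

# The guild's method: run many times, measure each time, tally.
counts = Counter(run_noisy_circuit() for _ in range(1000))
best, _ = counts.most_common(1)[0]
print(best)  # with luck, the right answer dominates the tally
```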

Entanglement, the invisible argument

If superposition is qubits refusing to commit, entanglement is qubits refusing to shut up about each other.

Entangled qubits are bound together in ways that transcend space, time, and common sense. Measure one and you instantly know something about the other, even if it is on the far side of the city, or the disc, or possibly another universe altogether. Einstein called this “spooky action at a distance” and was deeply unhappy about it. The universe did not care about Einstein’s happiness.

How entanglement works, sort of

Two qubits become entangled when their quantum states are prepared such that measuring one immediately determines the state of the other. They do not communicate, exactly. They simply share a quantum state that refuses to be described independently. It is like a pair of dice that always show the same number, even when rolled in different buildings by people who have never met.

This does not allow faster-than-light communication, before you ask. You cannot send messages via entanglement because you cannot control what result the measurement produces. You just know that whatever result you get here, the other qubit will produce the correlated result there. Useful for certain applications. Useless for sending “meet me at the Drum” across town instantaneously.
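
A toy sampler makes both points at once, the correlation and the uselessness for messaging. The probabilities are the ideal Bell-pair values, not hardware numbers:

```python
import random

random.seed(1)

# For the Bell pair (|00> + |11>)/sqrt(2), the only joint outcomes
# are "00" and "11", each with probability 1/2.
def measure_bell_pair():
    return "00" if random.random() < 0.5 else "11"

samples = [measure_bell_pair() for _ in range(2000)]

# The pair never disagrees: perfect correlation, the matched dice.
assert all(s[0] == s[1] for s in samples)

# But each qubit on its own is a fair coin, which is why no message
# rides along: nobody can choose which result comes up.
ones = sum(s[0] == "1" for s in samples) / len(samples)
print(round(ones, 2))  # hovers around 0.5
```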

Why entanglement matters

Entanglement is essential for quantum computing. Many quantum algorithms rely on entangled qubits to perform computations that classical computers cannot efficiently replicate. Entanglement creates correlations that allow quantum states to encode and process information in ways that classical bits simply cannot match.

It is also a massive pain to maintain. Entangled qubits are fragile. Any interaction with the environment, any stray photon or wandering magnetic field, can destroy the entanglement. The qubits decohere, the correlation breaks, and your computation collapses into expensive randomness.

Creating entanglement requires precise control. Gates must be applied with exquisite timing. Qubits must be isolated from environmental noise whilst still allowing controlled interactions. This is quantum engineering at its most demanding, like trying to whisper secrets in a thunderstorm whilst blindfolded.

Entanglement in practice

In theory, entanglement lets you create quantum states that represent complex correlations across many qubits. In practice, you are lucky if you can reliably entangle three qubits without something going wrong.

Current quantum computers can create entanglement across dozens of qubits, but only for brief moments before decoherence destroys it. Maintaining entanglement long enough to complete useful computations is the central challenge of quantum computing. It is the difference between theory and engineering, between “this should work” and “this definitely will not work, but we are trying anyway.”

The guild maintains extensive documentation on entanglement protocols, most of which concludes with some variation of “good luck” or “avoid sneezing.” This is honest, if not encouraging.

Decoherence, the morning after

Qubits do not stay quantum. Given time, or noise, or a slightly disapproving glance from across the room, they decohere. Superposition collapses. Entanglement breaks. Your quantum state, lovingly prepared over minutes or hours, decays into thermal noise.

Decoherence is not a bug. It is physics asserting itself. Quantum states are delicate. They require isolation from the environment. Any interaction, any exchange of energy, any coupling to external degrees of freedom, and the quantum information leaks away like secrets at a guild meeting.

Sources of decoherence

Thermal noise is the obvious culprit. Qubits must be kept extraordinarily cold, typically millikelvins above absolute zero, because thermal energy disrupts quantum states. A single stray photon with sufficient energy can decohere a qubit. Room temperature is a furnace. Even the cosmic microwave background is too hot.
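
The millikelvin requirement follows from a short calculation. Assuming a typical transition frequency of about 5 GHz for a superconducting qubit (an illustrative figure, not a spec), the mean number of thermal photons at that frequency tells the story:

```python
import math

h = 6.626e-34   # Planck constant, J*s
kB = 1.381e-23  # Boltzmann constant, J/K
f = 5e9         # assumed qubit transition frequency, Hz

def thermal_occupation(T):
    # Bose-Einstein occupation: mean thermal photons at frequency f.
    return 1 / (math.exp(h * f / (kB * T)) - 1)

print(thermal_occupation(0.02))  # 20 mK fridge: essentially zero photons
print(thermal_occupation(300))   # room temperature: a furnace of them
```

At 20 millikelvin the qubit almost never meets a stray photon at its own frequency; at room temperature it is bathed in roughly a thousand of them.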

Electromagnetic interference is another enemy. Stray magnetic fields, ambient radio waves, even the electromagnetic signature of nearby electronics, all contribute to decoherence. Quantum computers are shielded extensively, wrapped in layers of protection like paranoid nobility in heavy cloaks. It helps, somewhat.

Vibrations matter too. Mechanical motion couples into quantum systems. Qubits built on superconducting circuits can decohere from vibrations in the infrastructure. Qubits based on trapped ions require vacuum systems with vibration isolation. The universe is full of tiny disturbances that classical computers ignore but quantum computers cannot.

Even cosmic rays contribute. High-energy particles from space slam through your carefully isolated quantum processor and occasionally interact with a qubit. This is rare, but it happens, and when it does your computation fails in a way that is difficult to debug because the cause is literally from space.

Fighting decoherence

The primary defence is isolation. Keep qubits cold. Shield them from electromagnetic interference. Isolate them from vibrations. Operate in vacuum. Build in clean rooms. Do everything possible to minimise environmental coupling.

The second defence is speed. Run your computation faster than decoherence can destroy it. Quantum gates operate on nanosecond timescales. Decoherence might take microseconds or milliseconds. The race is on. Finish your calculation before the quantum state collapses. If you are fast enough, and lucky enough, you can extract useful results before physics ruins everything.

The third defence is quantum error correction, which is an entire field of research dedicated to the premise that if you cannot stop decoherence, you can at least detect and fix its effects. This requires encoding each logical qubit into multiple physical qubits, monitoring for errors, and correcting them on the fly. It works, in theory. In practice it requires so many physical qubits to encode a single logical qubit that current quantum computers cannot implement it meaningfully. This is the great challenge of quantum computing, scaling up whilst keeping decoherence down.
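
The core trick can be sketched classically. This repetition code only illustrates the majority-vote logic; real quantum codes must work around the no-cloning theorem rather than copying states outright, and the 5% flip rate is invented for illustration:

```python
import random
from collections import Counter

random.seed(2)

# Encode one logical bit as three physical copies, let noise flip
# each copy independently, then recover by majority vote.
def transmit(bit, flip_prob=0.05):
    noisy = [b ^ (random.random() < flip_prob) for b in [bit] * 3]
    return Counter(noisy).most_common(1)[0][0]  # majority vote

# A single flip is corrected; only two or more flips cause a logical
# error, so the logical error rate is far below the physical one.
trials = 10_000
errors = sum(transmit(1) != 1 for _ in range(trials))
print(errors / trials)  # roughly 3*p^2, well under the 5% physical rate
```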

Living with decoherence

Practical quantum computing is a race against decoherence. Every algorithm must complete before the quantum states decay. Every gate must execute reliably despite noise. Every measurement must happen before the information is lost.

This constrains what quantum computers can do. Long computations are not feasible because decoherence will destroy the state before completion. Deep circuits fail because accumulated errors overwhelm the signal. Complex entanglement patterns cannot be maintained because decoherence breaks them faster than they can be used.

Current quantum processors have decoherence times measured in microseconds to milliseconds, depending on the qubit technology. This limits practical computations to tens or hundreds of gates. Beyond that, the noise wins. Error correction might extend this eventually, but today’s quantum computers are fundamentally limited by how fast they can operate before physics loses patience.
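
The arithmetic behind that limit is short. The figures below are illustrative, not vendor specifications:

```python
# How many gates fit inside one coherence window?
coherence_time_ns = 100_000  # 100 microseconds of coherence
gate_time_ns = 50            # one fast gate

slots = coherence_time_ns // gate_time_ns
print(slots)  # 2000 gate slots in principle; errors cut this far lower
```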

The guild’s unofficial motto regarding decoherence is “work quickly and hope.” This is not a robust engineering strategy, but it is honest.

The economics of keeping apprentices in line

Qubits are expensive. Not “a decent meal at The Patrician’s table” expensive. Monumentally expensive. The sort of expensive that makes accountants reach for calming beverages.

Hardware costs

Building a quantum computer requires infrastructure that would make a Klatchian prince wince. Dilution refrigerators to achieve millikelvin temperatures cost hundreds of thousands of euros each. The supporting electronics, the shielding, the vibration isolation, the vacuum systems, all add up rapidly. A modest quantum processor might require millions of euros in hardware before you run a single gate.

Then there is maintenance. Cryogenic systems require constant upkeep. Helium is expensive and must be replenished. Components fail. Calibrations drift. Quantum computers are not appliances you install and forget. They are temperamental research instruments that demand constant attention from specialists.

The largest quantum computers, the ones with hundreds of qubits that make headlines, represent investments of tens to hundreds of millions of euros. This includes the hardware, the facilities, the engineering teams, the researchers, and the years of development required to coax the machinery into producing useful results.

Operational costs

Even if you have built a quantum computer, operating it is expensive. Power consumption for the cryogenic systems is substantial. Helium costs accumulate. Staff salaries for the quantum engineers, physicists, and software developers required to keep the system running are not trivial.

Cloud quantum computing services exist, allowing organisations to rent time on someone else’s expensive quantum hardware. This is cheaper than building your own, but not cheap. Rates vary, but expect to pay euros per hour for access to quantum processors, with actual computation time measured in milliseconds. If your algorithm requires many runs to produce statistically valid results, costs accumulate rapidly.

When qubits are worth the cost

For certain problems, quantum computers offer genuine advantages that justify the expense. Simulating quantum systems, for instance, is something quantum computers can do efficiently where classical computers struggle. Drug discovery, materials science, and cryptography research have applications where quantum advantage is real.

For most applications, however, quantum computers are vastly more expensive than classical alternatives. Running machine learning on quantum hardware costs orders of magnitude more than running it on GPUs or CPUs, whilst offering little to no performance advantage for most tasks. Unless your problem is specifically suited to quantum computation, classical hardware is cheaper, faster, more reliable, and far more practical.

The guild’s purchasing decisions reflect this. Quantum hardware is acquired for research, for exploring possibilities, and for gaining expertise. It is not acquired because the finance director calculated return on investment and decided quantum made economic sense. Those calculations consistently favour classical hardware for nearly everything.

The apprentice diversity problem

Not all qubits are created equal. Different physical implementations, different characteristics, different advantages, different headaches.

Superconducting qubits

The most common approach. Qubits built from superconducting circuits operating at millikelvin temperatures. These are the qubits used by IBM, Google, and others in their quantum processors.

Advantages: Fast gate operations (nanoseconds). Reasonably controllable. Fabricated using adapted semiconductor manufacturing techniques. Scalable, in theory.

Disadvantages: Requires extreme cooling (millikelvins). Decoherence times measured in microseconds to milliseconds. Sensitive to electromagnetic noise. Limited connectivity between qubits without complex routing.

Best suited for: Research. Near-term quantum algorithms. Benchmarking. Experiencing existential dread about decoherence.

Trapped ion qubits

Qubits encoded in individual ions held in electromagnetic traps. Used by IonQ, Honeywell, and academic researchers.

Advantages: Long decoherence times (seconds). High-fidelity gates. All-to-all connectivity (any qubit can interact with any other). More forgiving of errors than superconducting qubits.

Disadvantages: Slower gate operations (microseconds). Difficult to scale to many qubits. Vacuum systems required. Laser systems complex and expensive.

Best suited for: High-fidelity quantum circuits. Algorithms requiring many gates. Pretending qubits are obedient.

Topological qubits

Theoretical qubits that encode information in topological states of matter, making them inherently resistant to decoherence. Microsoft’s approach.

Advantages: Should be far more stable than other qubits. Error correction requirements reduced. Would revolutionise quantum computing if they worked.

Disadvantages: Do not exist yet as practical qubits. Theoretical understanding is incomplete. Experimental progress has been slower than hoped. Might not work at all.

Best suited for: Academic papers. Hoping for breakthroughs. Explaining to investors why progress is slower than promised.

Photonic qubits

Qubits encoded in photons, manipulated via optical circuits. Used by Xanadu, PsiQuantum, and others.

Advantages: Operate at room temperature (photons do not decohere in the same way). Can leverage existing photonics manufacturing. Potentially scalable.

Disadvantages: Difficult to create entanglement between photons. Photon loss is a major issue. Requires probabilistic gates, making circuits complex. Two-qubit gates are slow and inefficient.

Best suited for: Quantum communication. Certain quantum sampling algorithms. Niche applications where photonics advantages outweigh limitations.

Neutral atom qubits

Qubits encoded in neutral atoms held in optical tweezers. Emerging approach pursued by several startups and research groups.

Advantages: Long decoherence times. Flexible qubit positioning. All-to-all connectivity possible. Scalable array architectures.

Disadvantages: Slower gates than superconducting qubits. Requires complex laser systems. Still early in development.

Best suited for: Medium-term quantum computing. Applications requiring many qubits with flexible connectivity.

Choosing apprentices wisely

No qubit technology is clearly superior. Each has trade-offs. Superconducting qubits are fast but fragile. Trapped ions are stable but slow. Topological qubits are theoretically perfect but do not exist. Photonic qubits avoid some problems whilst creating others. Neutral atoms show promise but remain developmental.

For practical quantum computing today, superconducting qubits and trapped ions dominate. They are the technologies with the most mature hardware, the most available systems, and the most active development. If you are accessing quantum computers via cloud services, you are probably using one of these two approaches.

The guild maintains working relationships with multiple quantum hardware providers, hedging bets on which technology will ultimately scale. This is prudent, because nobody knows which approach will win. Possibly none will. Possibly all will find niches. The future of quantum computing is unclear, and the apprentices are not talking.

The reality of wrangling quantum chaos

Managing qubits in practice is an exercise in careful frustration. You calibrate. You configure. You run your circuit. Something goes wrong. You recalibrate. You try again. Different thing goes wrong. You adjust. You persist. Eventually you get results, or you give up and try a different approach.

Calibration is never-ending

Quantum processors require constant calibration. Gates drift. Qubit frequencies shift. Readout fidelities change. Every day, sometimes multiple times per day, parameters must be recalibrated to maintain performance.

This is not fully automated. Engineers monitor calibration metrics. When something looks wrong, they investigate. They adjust control pulse parameters. They rerun calibration routines. They check for hardware issues. This is skilled work, requiring expertise in quantum control and knowledge of the specific hardware.

For cloud quantum computing users, someone else handles calibration. For organisations running their own quantum hardware, calibration is a significant operational burden. It is maintenance that never ends, like painting a very expensive bridge that also requires a physics degree to understand.

Error rates matter profoundly

Classical computers have error rates so low they are effectively perfect. Quantum computers do not. Gate error rates are typically 0.1% to 1%, meaning one in every thousand to one in every hundred operations fails. For shallow circuits this is manageable. For deep circuits with hundreds of gates, errors accumulate until the output is garbage.

This limits what quantum computers can currently do. Only algorithms with circuits shallow enough that errors do not overwhelm the signal are feasible. Deep circuits require error correction, which requires far more qubits than currently available. Until error rates improve dramatically, or error correction becomes practical, quantum computing is constrained to relatively simple computations.
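
The accumulation is easy to estimate: if each gate succeeds independently with probability (1 - p), a circuit of n gates succeeds with probability roughly (1 - p)^n. Real error accumulation is messier, but the trend is the point:

```python
def circuit_success(gate_error, depth):
    # Independent-error approximation: each gate works with
    # probability (1 - p), so the whole circuit with (1 - p)^depth.
    return (1 - gate_error) ** depth

# At a 1% gate error rate, usefulness evaporates quickly with depth.
for depth in (10, 100, 1000):
    print(depth, circuit_success(0.01, depth))
```

Ten gates mostly work, a hundred is a coin flip territory, and a thousand is statistically indistinguishable from asking a pigeon.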

The guild tracks error rates meticulously. Every algorithm is designed with error budgets in mind. How many gates can we afford? How deep can the circuit be before noise dominates? These are the questions that determine feasibility. Often the answer is “shallower than we would like.”

The debugging experience

Debugging quantum circuits is unlike debugging classical code. You cannot step through execution. You cannot inspect quantum states directly without destroying them. You cannot log intermediate values without collapsing superposition.

Instead, you run the circuit many times. You collect statistics. You infer what is happening based on output distributions. You compare expected results to observed results. If they match, your circuit probably works. If they do not, something is wrong, and now you must figure out what.
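
One simple yardstick for "do the distributions match" is total variation distance. The threshold for "probably works" is a judgment call, not a standard:

```python
import random
from collections import Counter

random.seed(3)

def total_variation(expected, observed):
    # Half the sum of absolute differences over all outcomes.
    keys = set(expected) | set(observed)
    return 0.5 * sum(abs(expected.get(k, 0) - observed.get(k, 0))
                     for k in keys)

# What a Bell-pair circuit should give, versus simulated shot data.
expected = {"00": 0.5, "11": 0.5}
shots = [random.choice(["00", "11"]) for _ in range(1000)]
observed = {k: v / len(shots) for k, v in Counter(shots).items()}

distance = total_variation(expected, observed)
print(distance)  # near zero means the circuit probably works
```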

Often the problem is hardware. A gate is miscalibrated. A qubit has drifted out of spec. Crosstalk between qubits is introducing errors. These hardware issues are opaque to the programmer. You see incorrect results, but determining the cause requires access to hardware diagnostics and expertise in quantum control.

Sometimes the problem is algorithmic. Your circuit does not do what you thought. Your quantum gates do not compose the way you expected. Your understanding of quantum mechanics was incomplete. These problems are your fault, but they are difficult to diagnose because you cannot easily inspect what is happening.

The guild’s debugging methodology is “run it many times, compare results, sacrifice a small offering to the gods of statistical mechanics, try again.” This is not a joke. This is actual procedure.

Surviving the quantum apprenticeship

If you must work with qubits, the guild offers the following advice.

Manage expectations ruthlessly

Quantum computing will not solve all your problems. It will not revolutionise your business overnight. It will not make machine learning suddenly effortless. It is a research technology with limited current applications, high costs, and significant challenges.

Set expectations accordingly. Quantum is for exploration, learning, and preparing for eventual practical utility. It is not for production deployment of critical systems. Anyone promising otherwise is either misinformed or selling something.

Start classically

Before touching quantum hardware, simulate quantum circuits classically. Simulators are free, always available, and do not suffer from hardware noise. They are also slower and limited to fewer qubits, but for learning and initial development they are invaluable.

Only move to real quantum hardware when classical simulation becomes infeasible or when you specifically need to study real hardware behaviour. Quantum hardware time is expensive and limited. Do not waste it on experiments that could be done classically.

Embrace shallow circuits

Design algorithms with minimal circuit depth. Fewer gates means fewer opportunities for errors. Shallow circuits are more likely to produce useful results on noisy hardware.

This severely constrains what you can do, but it is the reality of current quantum computers. Work within the constraints. Save deep circuits for future quantum processors with error correction.

Document everything

Quantum computing is sufficiently complex and unfamiliar that future you will not remember why present you made certain decisions. Document your reasoning. Record your parameters. Note your observations.

This documentation serves two purposes. First, it helps you debug when things inevitably go wrong. Second, it helps colleagues who must understand your work. Quantum expertise is rare. Make it easier for others to follow your reasoning.

Collaborate with experts

Unless you are yourself a quantum physicist or quantum engineer, you will need expert guidance. Quantum computing straddles physics, computer science, and engineering in ways that require specialised knowledge.

Find collaborators who understand quantum mechanics, quantum control, and quantum algorithms. Learn from them. Ask questions. Build expertise gradually. Quantum computing is not something one masters through weekend tutorials. It requires sustained study and hands-on experience.

Accept failure gracefully

Many quantum experiments fail. Gates do not work as expected. Circuits produce noise. Hardware misbehaves. Results are inconclusive. This is normal.

Do not take it personally. Quantum computing is difficult. Failure is part of the process. Learn from failures. Iterate. Try different approaches. Progress is slow and frustrating, but it is progress.

The guild has lost many apprentices to quantum computing. Some burnt out from frustration. Others concluded the technology was not yet ready for practical use and moved to more stable fields. Those who remain have developed patience, persistence, and a high tolerance for uncertainty. These are necessary virtues for anyone working with qubits.

The uncomfortable truth about apprentices

Qubits are marketed as the future of computing. Revolutionary. Transformative. Inevitable.

The reality is more complicated. Qubits are useful for specific problems where quantum advantage is genuine. For most other problems, classical computers are vastly superior in every practical dimension: cost, speed, reliability, accessibility, ease of use.

Quantum computing will not replace classical computing. It will complement it, eventually, for certain applications. This is still valuable, but it is not the revolution the marketing suggests.

The guild’s position is pragmatic. Quantum computing is worth exploring. Research is valuable. Expertise is being built. But production deployments are years away, possibly decades for many applications. Organisations investing in quantum should do so with realistic timelines and modest expectations.

Meanwhile, the apprentices wiggle, decohere, and occasionally produce useful results. They remain expensive, fragile, and difficult to manage. They are the future, perhaps. Or they are an elaborate lesson in why theoretical breakthroughs do not always translate to practical engineering.

Either way, someone must keep the apprentices in line. That duty falls to the quantum engineers, who persist despite the chaos, because occasionally, just occasionally, the qubits cooperate, and something genuinely remarkable happens. Those moments are rare. They are also what makes the frustration worthwhile.

The guild notes these moments carefully, documents them thoroughly, and reminds everyone that quantum computing is a long game. Progress is measured in years and decades, not weeks and months. For those with patience and realistic expectations, quantum computing offers genuine opportunities. For everyone else, best to wait and see how it develops.

The apprentices are not going anywhere. They will continue misbehaving, decohering at inconvenient moments, and occasionally astonishing everyone by working exactly as intended. Managing them will remain more art than science, more negotiation than command, more hope than certainty.

Welcome to quantum computing. Mind the qubits. They bite.