Quantum Computing
Quantum computing studies information processing in systems whose states are vectors in complex Hilbert space and whose allowed operations are physical quantum evolutions, measurements, and controlled noise processes. This area sits between quantum mechanics, linear algebra, cryptography, and computer architecture: the central question is not whether quantum theory is strange, but which computational tasks become different when superposition, interference, entanglement, and measurement are engineered deliberately.
The foundational pages in this section now use Michael A. Nielsen and Isaac L. Chuang's Quantum Computation and Quantum Information as the primary textbook reference, synthesized with the wiki's earlier draft material. The current split is practical: hardware asks how qubits are physically made; algorithms asks where coherent quantum evolution changes complexity; quantum error correction asks how fragile states can be protected; and quantum machine learning asks which learning or optimization workflows may benefit from quantum subroutines without overstating speculative claims.
Figure: Bloch sphere representation of a single-qubit state. Image: Wikimedia Commons, Glosser.ca, CC BY-SA 3.0.
Primary Source
Nielsen and Chuang is the canonical source for this area's notation and core results: Dirac notation and density operators, quantum circuits, QFT and phase estimation, Shor and Grover, physical implementation criteria, quantum operations, distance measures, stabilizer codes, fault tolerance, entropy, and quantum information theory. QML is marked as modern supplementary material because it is not treated directly as a standalone field in N&C; it borrows the book's notation for states, channels, measurements, and information measures.
Definitions
A qubit is a normalized vector in a two-dimensional complex Hilbert space:

$$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1, \quad \alpha, \beta \in \mathbb{C}.$$

An $n$-qubit pure state belongs to a $2^n$-dimensional tensor-product space. The basis states are bit strings $|x\rangle$ for $x \in \{0,1\}^n$, and a general pure state is

$$|\psi\rangle = \sum_{x \in \{0,1\}^n} \alpha_x |x\rangle, \qquad \sum_x |\alpha_x|^2 = 1.$$
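These conditions are easy to check numerically. A minimal pure-Python sketch (the amplitudes and the helper name `norm_squared` are illustrative, not from the text):

```python
import math

# A single-qubit state |psi> = alpha|0> + beta|1>, stored as a list of
# complex amplitudes indexed by basis label. Values here are illustrative.
alpha, beta = complex(3 / 5), complex(0, 4 / 5)
psi = [alpha, beta]

def norm_squared(state):
    """Sum of |amplitude|^2; must equal 1 for a valid state."""
    return sum(abs(a) ** 2 for a in state)

assert math.isclose(norm_squared(psi), 1.0)

# An n-qubit state has 2**n amplitudes; here, a uniform 3-qubit superposition.
n = 3
uniform = [1 / math.sqrt(2 ** n)] * (2 ** n)
assert math.isclose(norm_squared(uniform), 1.0)
```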
A quantum gate is usually modeled as a unitary matrix $U$ with $U^\dagger U = I$. A quantum circuit is a sequence of such gates, followed by measurements in some basis. Real devices also include state preparation, calibration, crosstalk, leakage outside the computational subspace, control pulses, readout errors, and classical feedback.
A measurement converts quantum amplitudes into classical outcomes. For a projective computational-basis measurement, the probability of observing bit string $x$ is $|\alpha_x|^2$, and the post-measurement state collapses to the observed subspace. General measurement is described by positive operator-valued measures, but the circuit model often compiles final readout into computational-basis measurements plus preceding unitaries.
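Both definitions can be sketched in a few lines, assuming the standard Hadamard matrix; `measure` is a hypothetical helper that samples a basis index with probability $|\alpha_x|^2$:

```python
import math
import random

# Hadamard gate as a 2x2 matrix; H is unitary, so H† H = I.
s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]

def conj_transpose(m):
    return [[complex(m[j][i]).conjugate() for j in range(len(m))]
            for i in range(len(m[0]))]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# Check the gate condition U† U = I entry by entry.
I2 = matmul(conj_transpose(H), H)
assert all(math.isclose(abs(I2[i][j] - (1 if i == j else 0)), 0, abs_tol=1e-12)
           for i in range(2) for j in range(2))

def measure(state):
    """Sample a basis index x with probability |amplitude_x|^2."""
    r, acc = random.random(), 0.0
    for x, a in enumerate(state):
        acc += abs(a) ** 2
        if r < acc:
            return x
    return len(state) - 1

# |+> = H|0>: measurement gives 0 or 1, each with probability 1/2.
plus = [s, s]
counts = [0, 0]
for _ in range(10000):
    counts[measure(plus)] += 1
```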
An oracle is a black-box unitary used to define query complexity. Many early algorithms, including Deutsch-Jozsa, Bernstein-Vazirani, Simon's algorithm, and Grover search, are clearest in the oracle model. An oracle result is not automatically an end-to-end application speedup because real workloads also need data loading, oracle construction, fault-tolerant compilation, and error budgets.
A NISQ device is a noisy intermediate-scale quantum processor: it has enough qubits to be scientifically interesting, but not enough low-error logical qubits for large fault-tolerant algorithms. A fault-tolerant quantum computer uses quantum error correction so that encoded logical operations fail much less often than physical operations.
The main pages in this area are:
| Page | Main question | Typical neighbors |
|---|---|---|
| Hardware | What physical systems implement qubits and gates? | Quantum mechanics, controls, materials, cryogenics, optics |
| Algorithms | Which computational problems admit quantum speedups? | Complexity, number theory, linear algebra |
| Quantum Error Correction | How can noisy qubits form reliable logical qubits? | Coding theory, stabilizers, many-body physics |
| Quantum Machine Learning | How do parametrized circuits, kernels, and quantum subroutines interact with learning? | Machine learning, optimization, kernels |
Key results
The first structural result is that quantum information is linear. If an algorithm maps basis states $|x\rangle$ by a unitary $U$, then it also maps superpositions by linearity:

$$U \sum_x \alpha_x |x\rangle = \sum_x \alpha_x \, U|x\rangle.$$
This is the source of both power and constraint. A circuit can evaluate $2^n$ branches coherently, but measurement only samples from the final distribution. Useful algorithms arrange interference so that wrong answers cancel and right answers receive amplified probability.
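The cancellation mechanism can be seen on one qubit: applying the Hadamard twice returns $|0\rangle$, because the two paths into $|1\rangle$ carry opposite amplitudes. A minimal sketch (`apply_1q` is an illustrative helper):

```python
import math

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]

def apply_1q(gate, state):
    """Apply a 2x2 gate to a single-qubit state [a0, a1]."""
    a0, a1 = state
    return [gate[0][0] * a0 + gate[0][1] * a1,
            gate[1][0] * a0 + gate[1][1] * a1]

# H|0> = |+>: both branches present with amplitude 1/sqrt(2).
plus = apply_1q(H, [1.0, 0.0])

# H|+> = |0>: the two paths into |1> carry amplitudes +1/2 and -1/2 and cancel.
back = apply_1q(H, plus)
assert math.isclose(back[0], 1.0)
assert math.isclose(back[1], 0.0, abs_tol=1e-12)
```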
The second structural result is that arbitrary quantum states cannot be copied. The no-cloning theorem follows from linearity. If a unitary $U$ copied two arbitrary states,

$$U(|\psi\rangle \otimes |0\rangle) = |\psi\rangle \otimes |\psi\rangle, \qquad U(|\phi\rangle \otimes |0\rangle) = |\phi\rangle \otimes |\phi\rangle,$$

then inner products would be preserved by unitarity:

$$\langle\psi|\phi\rangle = \langle\psi|\phi\rangle^2.$$

This equation only holds when $\langle\psi|\phi\rangle$ is $0$ or $1$, that is, for orthogonal or identical states. Since arbitrary quantum states can have any inner product, universal copying is impossible. This fact links computing to quantum communication and quantum security.
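A standard illustration of the same point (an example, not the book's proof): CNOT copies basis states perfectly, but by linearity it maps a superposition to an entangled Bell state rather than to two independent copies:

```python
import math

s = 1 / math.sqrt(2)

def cnot(state):
    """CNOT on a 2-qubit state [a00, a01, a10, a11]: flip target when control is 1."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

# Basis states are copied: |10> -> |11>.
assert cnot([0, 0, 1, 0]) == [0, 0, 0, 1]

# But ((|0>+|1>)/sqrt(2)) ⊗ |0> maps to the Bell state (|00>+|11>)/sqrt(2),
# not to |+>|+> = [1/2, 1/2, 1/2, 1/2]: linearity forbids universal copying.
bell = cnot([s, 0.0, s, 0.0])
assert bell == [s, 0.0, 0.0, s]
```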
The third structural result is that entanglement changes representation cost. A product state has the form

$$|\psi\rangle = |\psi_1\rangle \otimes |\psi_2\rangle \otimes \cdots \otimes |\psi_n\rangle,$$

but a generic $n$-qubit state requires $2^n$ complex amplitudes. This exponential state space is not itself an exponential computer, because amplitudes are not directly readable. It becomes useful only when a circuit has a problem-specific interference pattern.
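For two qubits the distinction is checkable: a state $[\alpha_{00}, \alpha_{01}, \alpha_{10}, \alpha_{11}]$ is a product state exactly when its $2\times 2$ amplitude matrix has rank 1, i.e. $\alpha_{00}\alpha_{11} = \alpha_{01}\alpha_{10}$. A sketch using this criterion (`is_product_2q` is a hypothetical helper):

```python
import math

def is_product_2q(state, tol=1e-12):
    """Rank-1 test on the 2x2 amplitude matrix [[a00, a01], [a10, a11]]."""
    a00, a01, a10, a11 = state
    return abs(a00 * a11 - a01 * a10) < tol

s = 1 / math.sqrt(2)
assert is_product_2q([0.5, 0.5, 0.5, 0.5])   # |+>|+> is a product state
assert not is_product_2q([s, 0.0, 0.0, s])   # the Bell state is entangled
```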
The fourth structural result is computational: known exponential speedups, such as Shor's factoring algorithm, use algebraic structure and phase estimation; quadratic speedups, such as Grover search, use amplitude amplification; and many NISQ methods are heuristic rather than complexity-theoretic. This is why this wiki separates the mature algorithmic core from the more experimental quantum machine learning literature.
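The quadratic character of Grover search can be reproduced with a two-amplitude simulation, because every intermediate state is symmetric between the marked item and the rest. A sketch under that assumption (function name illustrative):

```python
import math

def grover_success_prob(n_items, iterations):
    """Probability of measuring the single marked item after Grover iterations.
    Each iteration applies the oracle sign flip, then inversion about the mean;
    by symmetry only two amplitudes need tracking (marked vs. unmarked)."""
    a_marked = 1 / math.sqrt(n_items)
    a_other = 1 / math.sqrt(n_items)
    for _ in range(iterations):
        a_marked = -a_marked                                   # oracle phase flip
        mean = (a_marked + (n_items - 1) * a_other) / n_items
        a_marked = 2 * mean - a_marked                         # inversion about mean
        a_other = 2 * mean - a_other
    return a_marked ** 2

N = 1024
k = round(math.pi / 4 * math.sqrt(N))  # ~25 queries, versus ~512 expected classically
p = grover_success_prob(N, k)
assert p > 0.99
```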
Finally, scalable quantum computing appears to require error correction. Physical qubits decohere and gates are imperfect. The threshold theorem states, roughly, that if physical noise is sufficiently weak and local, arbitrarily long quantum computations can be performed with polylogarithmic overhead by encoding information and using fault-tolerant operations. The theorem is a statement about asymptotic possibility, not a promise that any particular hardware platform is already practical.
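A rough way to see what "below threshold" buys (a heuristic suppression model often quoted for surface codes, an assumption rather than a statement from the text): the logical error rate falls as a power of $p/p_{\text{th}}$ with an exponent that grows with code distance $d$:

```python
def logical_error_rate(p, p_th=1e-2, d=11, prefactor=0.1):
    """Heuristic suppression model: p_L ≈ A * (p / p_th) ** ((d + 1) // 2).
    All parameter values here are illustrative, not measured."""
    return prefactor * (p / p_th) ** ((d + 1) // 2)

# Ten times below threshold, each distance step d -> d + 2 gains ~10x reliability.
assert logical_error_rate(1e-3, d=11) < logical_error_rate(1e-3, d=9)
# Above threshold, increasing distance no longer helps.
assert logical_error_rate(2e-2, d=11) > logical_error_rate(2e-2, d=9)
```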
Visual
The dependencies between these pages run in both directions because progress is coupled. Algorithms determine the required logical gates; error correction turns logical requirements into physical overhead; hardware determines which native gates, connectivity, and noise models are realistic; and QML often lives at the boundary where hardware constraints and classical optimization meet.
Worked example 1: Classifying a quantum-computing claim
Problem. A paper claims: "Our quantum processor samples from a distribution that is hard for classical computers; therefore quantum computers can now break RSA." Decide which part of the quantum-computing map this claim belongs to, and whether the conclusion follows.
Method.
- Identify the task. Sampling from a hard distribution is an algorithms-and-complexity claim, usually related to quantum advantage experiments.
- Identify the output type. A sampling experiment produces bit strings from a distribution. It does not necessarily solve a decision, search, optimization, or factoring problem.
- Compare with the RSA threat. Breaking RSA by Shor's algorithm requires coherent modular exponentiation, quantum phase estimation, enough logical qubits, and fault-tolerant gates.
- Check whether the hardware statement includes error correction. If the processor is NISQ, the experiment may be scientifically important but does not imply large-scale fault-tolerant factoring.
- State the conclusion conservatively.
Answer. The sampling result belongs mainly to algorithms and hardware. It may demonstrate a separation for a specialized sampling problem under assumptions, but it does not by itself imply that RSA can be broken. The RSA conclusion would require the resource path in quantum error correction: physical qubits to logical qubits, logical gates to modular exponentiation, and phase estimation with a low total failure probability. The checked answer is: the first claim may be about quantum sampling advantage; the RSA conclusion does not follow.
Worked example 2: Estimating state-vector storage
Problem. Estimate the classical memory required to store an arbitrary pure state of $n$ qubits using double-precision complex amplitudes. Assume each complex amplitude uses two 64-bit floating-point numbers.
Method.
- Count amplitudes. An $n$-qubit state has $2^n$ complex amplitudes.
- Count bytes per amplitude. A 64-bit float is $8$ bytes, so a complex number with real and imaginary parts uses $2 \times 8 = 16$ bytes.
- Multiply: $2^n \times 16 = 2^{n+4}$ bytes.
- Convert to gibibytes: $2^{n+4} / 2^{30} = 2^{n-26}$ GiB.
Answer. A dense state-vector simulation requires about $2^{n-26}$ GiB just for the state vector; for example, $n = 30$ qubits needs $16$ GiB and $n = 40$ needs $16384$ GiB ($16$ TiB). This does not include temporary arrays, gate matrices, memory alignment, or simulation overhead. The result checks the intuition that each additional qubit doubles the state-vector memory.
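The arithmetic above can be wrapped in a small helper (the function name and the example qubit counts are illustrative):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory for a dense state vector: 2**n amplitudes at 16 bytes each
    (two 64-bit floats per complex amplitude)."""
    return (2 ** n_qubits) * bytes_per_amplitude

GiB = 2 ** 30
assert statevector_bytes(30) / GiB == 16.0
# Each extra qubit doubles the requirement:
assert statevector_bytes(31) == 2 * statevector_bytes(30)
```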
Code
The following pure-Python snippet builds a small dependency graph for this section and prints a topological reading order for one possible learning path. It is not a simulator; it is a compact way to keep the conceptual prerequisites explicit.
```python
from collections import defaultdict, deque

prerequisites = {
    "hardware": ["qubits", "measurement"],
    "algorithms": ["qubits", "unitaries", "measurement"],
    "error correction": ["qubits", "measurement", "pauli operators"],
    "quantum machine learning": ["algorithms", "hardware", "optimization"],
    "fault tolerance": ["error correction", "hardware"],
}

# Build edges dep -> topic and count how many prerequisites each topic has.
graph = defaultdict(list)
indegree = defaultdict(int)
for topic, deps in prerequisites.items():
    indegree.setdefault(topic, 0)
    for dep in deps:
        graph[dep].append(topic)
        indegree[topic] += 1
        indegree.setdefault(dep, 0)

# Kahn's algorithm: repeatedly emit nodes with no remaining prerequisites.
queue = deque(sorted(node for node, deg in indegree.items() if deg == 0))
order = []
while queue:
    node = queue.popleft()
    order.append(node)
    for child in sorted(graph[node]):
        indegree[child] -= 1
        if indegree[child] == 0:
            queue.append(child)

print("Suggested dependency order:")
for i, topic in enumerate(order, start=1):
    print(f"{i}. {topic}")
```
Common pitfalls
- Treating a large Hilbert space as automatically useful computation. Useful quantum algorithms need an interference pattern and a readout strategy.
- Confusing a qubit count with computational capability. Coherence, gate fidelity, connectivity, measurement quality, and error-correction overhead matter.
- Assuming all quantum speedups are exponential. Grover search is quadratic, many quantum-walk results are problem-specific, and many NISQ methods are heuristic.
- Ignoring data loading. A claimed quantum advantage can disappear if preparing the input quantum state costs as much as solving the classical problem.
- Equating "quantum advantage" with broad practical usefulness. A demonstration can be scientifically valid while solving a narrow sampling task.
- Treating QML claims as established because the circuit is quantum. Generalization, trainability, noise, and classical baselines must be checked.
Connections
- Quantum hardware gives the physical constraints behind qubits, gates, and readout.
- Quantum algorithms explains Shor, Grover, phase estimation, HHL, and oracle algorithms.
- Quantum error correction explains how logical qubits are built from noisy physical qubits.
- Quantum machine learning covers variational circuits, quantum kernels, QAOA, and trainability.
- Quantum communication uses the same measurement and no-cloning principles for protocols.
- Quantum internet connects entanglement distribution, repeaters, teleportation, and networked processors.
- Quantum security studies QKD, quantum-safe cryptography, and security implications.
- Cryptography is the classical neighbor most directly affected by Shor's algorithm.
- Linear algebra supplies the language of vector spaces, tensor products, unitary matrices, and eigenvectors.
Further reading
- Michael A. Nielsen and Isaac L. Chuang, Quantum Computation and Quantum Information.
- John Preskill, Lecture Notes for Physics 219/Computer Science 219: Quantum Computation.
- Eleanor Rieffel and Wolfgang Polak, Quantum Computing: A Gentle Introduction.
- Phillip Kaye, Raymond Laflamme, and Michele Mosca, An Introduction to Quantum Computing.
- Scott Aaronson, Quantum Computing Since Democritus.