Quantum Computing


Postulates of QM




Andrew M. Steane, Quantum Computing, Reports on Progress in Physics, vol 61, pp 117-173 (1998).


In 1936, Alan Turing showed that there is a Universal Turing Machine (UTM) that can simulate any other Turing machine: if an algorithm can be executed on any machine, then there is an equivalent algorithm for a UTM which performs the same task. This marked the beginning of computer science. Later, John von Neumann developed a theoretical model for how to put together, in a practical fashion, all the components necessary for a computer to be fully as capable as a UTM. In 1947, when John Bardeen, Walter Brattain, and William Shockley developed the transistor, computers built from electronic components took off. Just as important, in 1948 Claude Shannon proved two key theorems of information theory: the noiseless channel coding theorem and the noisy channel coding theorem. The first quantifies the physical resources required to store the output of an information source; the second quantifies how much information can be transmitted reliably through a noisy communications channel. Error-correcting codes can then be used to protect information being sent through a noisy environment.
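Shannon's first theorem makes this quantitative: a source emitting symbols with probabilities p_i can be compressed to about H = -sum(p_i log2 p_i) bits per symbol, and no further. A minimal sketch (Python; the biased source is an illustrative example, not from the text):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits per symbol.
    Terms with p == 0 contribute nothing (0 * log 0 -> 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A biased binary source emitting '0' with probability 0.9 and '1' with 0.1:
H = shannon_entropy([0.9, 0.1])  # about 0.469 bits/symbol, well below 1
```

So this source can, in principle, be stored using roughly 0.47 bits per emitted symbol rather than one full bit per symbol.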

In the mid-1970s, Robert Solovay and Volker Strassen showed that it is possible to test whether an integer is prime or composite using a randomized algorithm. This suggested that a probabilistic Turing machine may be able to simulate any algorithmic process efficiently. Later, in 1985, David Deutsch attempted to define a computational device that would be capable of efficiently simulating an arbitrary physical system. This device, a quantum computer (QC), could solve computational problems that cannot be solved efficiently by classical computers. Indeed, a QC might have computational powers exceeding those of any classical computer.
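The Solovay-Strassen test itself is short enough to sketch. For random bases a it checks whether a^((n-1)/2) ≡ (a/n) (mod n), where (a/n) is the Jacobi symbol; the round count below is an arbitrary illustrative choice (Python):

```python
import random
from math import gcd

def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0, by the standard binary algorithm."""
    assert n > 0 and n % 2 == 1
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:           # factor out 2s; flip sign when n = 3,5 (mod 8)
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a                  # quadratic reciprocity
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def solovay_strassen(n, rounds=20):
    """False if n is certainly composite, True if n is probably prime."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        if gcd(a, n) > 1:
            return False
        if pow(a, (n - 1) // 2, n) != jacobi(a, n) % n:
            return False             # a is a witness of compositeness
    return True
```

Each round that passes halves (at most) the probability of mistaking a composite for a prime, so 20 rounds leave an error chance below one in a million.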

Two important problems - the problem of finding the prime factors of an integer, and the discrete logarithm problem - can be solved efficiently on a quantum computer. In 1994, Peter Shor demonstrated these results, showing that a QC is more powerful than a probabilistic Turing machine. In 1996, Lov Grover showed that another important problem - conducting a search through an unstructured search space - could also be sped up on a QC. At about the same time as Shor's and Grover's algorithms were discovered, many people were developing an idea Richard Feynman had suggested in 1982. Feynman pointed out that there are essential difficulties in simulating quantum mechanical systems on classical computers.

Quantum information theory started to take shape in 1995, when Ben Schumacher proved an analogue of Shannon's noiseless coding theorem and, in doing so, defined the 'quantum bit' or 'qubit' as a tangible physical resource. The theory of quantum error-correcting codes, developed to protect quantum states against noise, was worked out independently by two groups in 1996: Robert Calderbank and Peter Shor, and Andrew Steane; the resulting codes are now known as CSS codes after their initials. Earlier, in 1992, Charles Bennett and Stephen Wiesner had explained how to transmit two classical bits of information while only transmitting one quantum bit from sender to receiver, a result dubbed superdense coding.
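Superdense coding is concrete enough to simulate with four amplitudes. The sketch below (pure Python) assumes the usual textbook convention, which the text does not spell out: Alice and Bob share |Phi+> = (|00> + |11>)/sqrt(2), Alice applies X to encode her second bit and Z to encode her first, and Bob decodes with a CNOT followed by a Hadamard. Function names and the bit-to-gate assignment are illustrative choices.

```python
from math import sqrt

X = [[0, 1], [1, 0]]                 # bit flip
Z = [[1, 0], [0, -1]]                # phase flip
H = [[1 / sqrt(2), 1 / sqrt(2)],
     [1 / sqrt(2), -1 / sqrt(2)]]    # Hadamard

def apply_alice(gate, state):
    """Apply a 2x2 gate to Alice's qubit (the first one) of a two-qubit state.
    Amplitudes are ordered |ab> = |00>, |01>, |10>, |11>."""
    out = [0.0] * 4
    for a2 in (0, 1):
        for b in (0, 1):
            out[2 * a2 + b] = sum(gate[a2][a] * state[2 * a + b] for a in (0, 1))
    return out

def superdense_send(b1, b0):
    """Encode two classical bits into Alice's half of a shared Bell pair."""
    state = [1 / sqrt(2), 0, 0, 1 / sqrt(2)]   # |Phi+> = (|00> + |11>)/sqrt(2)
    if b0:
        state = apply_alice(X, state)          # second bit -> X (assumed convention)
    if b1:
        state = apply_alice(Z, state)          # first bit -> Z (assumed convention)
    # Bob decodes: CNOT (Alice's qubit controls Bob's), then H on Alice's qubit.
    state = [state[0], state[1], state[3], state[2]]   # CNOT swaps |10> and |11>
    state = apply_alice(H, state)
    idx = max(range(4), key=lambda i: abs(state[i]))   # all weight on one basis state
    return idx >> 1, idx & 1
```

Bob always ends in a definite computational-basis state, so both bits are recovered deterministically even though only one qubit crossed the channel.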

Two physics experiments are fundamental to quantum computation and quantum information:

Stern-Gerlach Experiment

EPR Thought Experiment

Information Theory

Hamming Distance

Noiseless Coding Theorem

Classical and Quantum Information

Maxwell's Demon and Entropy

James Clerk Maxwell proposed the demon in 1871 as a challenge to the validity of the second law of thermodynamics. In an irreversible process, because humans cannot control the motions of microscopic particles, the particles become disordered. The demon is an intelligent being who can supposedly operate a device that allows individual particles to pass or not. Suppose the demon works in an isolated chamber consisting of two compartments connected by a gate that can be either opened or closed. At the beginning, one compartment is filled with particles and the other is empty. The demon opens the gate to allow fast particles to pass into the empty compartment while the slow ones remain in the original. As time goes by, the compartment with the fast particles becomes hotter and the original becomes cooler. In other words, heat is transferred from a lower temperature to a higher temperature without any external work. The demon's action would cause a transition from disorder to order in the gas, that is, a decrease in entropy (see the thought experiment below). This process violates the second law of thermodynamics.

The connection between entropy and information can be applied to the problem of Maxwell's demon. Suppose you are called upon to guess a person's telephone number. In Hong Kong, telephone numbers have 8 digits, so your chance of guessing right is one in 10^8. However, if I give you a hint (information, an answer that I know) and tell you the first 7 digits, your chance increases to one in ten. So it is clear that the fewer the choices by which a particular state of a system may be achieved, the greater the information. A convenient measure of the information I conveyed when the number of choices is reduced from N0 to N1 is given by

I = k ln(N0/N1)

Since S = k ln N is the entropy of a system with N equally likely configurations, then

I = S0 - S1

The more the information, the greater the reduction of entropy: the entropy of the system is reduced by the amount of information gained about the state of the system.
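For the telephone-number example, the numbers work out as follows (Python; the Boltzmann-constant line is included only to connect with the thermodynamic form S = k ln N):

```python
import math

N0 = 10**8   # possible 8-digit telephone numbers
N1 = 10      # choices left once the first 7 digits are known

I_bits = math.log2(N0 / N1)       # information gained: 7 * log2(10), about 23.25 bits
k = 1.380649e-23                  # Boltzmann constant, J/K
delta_S = k * math.log(N0 / N1)   # the matching entropy reduction, I = S0 - S1
```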

Starting from the Church-Turing thesis, Zurek [4] proposed that the net ability of demons to extract useful work from systems depends on the sum of measures of two distinct aspects of disorder: (1) statistical entropy and (2) algorithmic information content. The Church-Turing thesis here involves the assumption that the intellectual abilities of Maxwell's demons can be regarded as equivalent to those of a universal Turing machine.

Statistical Entropy

H(ρ) = -Tr(ρ log2 ρ)

where ρ is the density matrix of the system; it determines the ignorance of the observer.
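Since this quantity depends only on the eigenvalues of the density matrix, it can be evaluated from the spectrum alone. A minimal sketch (Python; the two example states are standard illustrations, not from the text):

```python
import math

def statistical_entropy(eigenvalues):
    """H(rho) = -Tr(rho log2 rho), computed from rho's eigenvalues, in bits."""
    assert abs(sum(eigenvalues) - 1.0) < 1e-9   # a density matrix has trace 1
    return -sum(p * math.log2(p) for p in eigenvalues if p > 1e-15)

pure = statistical_entropy([1.0])        # pure state: the observer knows everything
mixed = statistical_entropy([0.5, 0.5])  # maximally mixed qubit: one full bit of ignorance
```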

Algorithmic Information Content

K(s) is given by the size, in bits, of the shortest program p* which, on the "operating system" of a Maxwell's demon, can reproduce the detailed description of the state s of the system. It quantifies the cost of storing the acquired information, which is related to the randomness inherent in the state of the system revealed by the measurement.
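K(s) is uncomputable in general, but a lossless compressor gives a crude upper bound: the compressed size of a state record can stand in for "shortest program". A sketch of this proxy (Python; zlib and the toy records are my illustrative choices, not Zurek's construction):

```python
import random
import zlib

def compressed_bits(record: bytes) -> int:
    """Crude upper bound on algorithmic information content:
    size, in bits, of a zlib-compressed description of the state."""
    return 8 * len(zlib.compress(record, 9))

random.seed(0)
ordered = b"01" * 4096                                          # highly regular record
disordered = bytes(random.randrange(256) for _ in range(8192))  # random record
# compressed_bits(ordered) is tiny; compressed_bits(disordered) stays near 8 * 8192
```

The regular record compresses to a few hundred bits while the random one barely compresses at all, mirroring the difference in their algorithmic information content.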

Physical Entropy

Zurek's physical entropy is the sum of the two measures above: S = H(ρ) + K(s).

A Thought Experiment

Consider the following thought experiment (a microcanonical approach) done on a computer with a journaling file system (jfs):

  1. Assume there is an algorithm to determine the locations of the particles and, from the speeds of the particles, the temperatures of the compartments. There are two compartments: one for holding the high-speed particles and one for the slow particles.
  2. The program runs and the system evolves. Each new configuration of the system is based on the previous ones, and the information is stored in a file.
  3. The jfs records every change to the output file so that we can restore the system to any state we want.
  4. This algorithm and program take the role of Maxwell's demon. The intelligent demon can execute programs to reconstruct (reversibly) records of past measurements, or to carry out logical operations to optimize performance.
  5. In order to satisfy thermodynamics (for the discussion of quantum computing later) we may also make the following assumption:
  6. The demon (program) can restore the system to previous states and restart a new configuration if the temperatures (calculated or physical) of the compartments do not meet the requirements, i.e. one becoming hotter and the other cooler.
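The steps above can be sketched as a toy simulation. Everything here - the speed distribution, the "fast" threshold, and the journal format - is an illustrative assumption standing in for the real jfs machinery (Python):

```python
import random

random.seed(1)
THRESHOLD = 1.0   # speed separating "fast" from "slow" (arbitrary units)

left = [random.expovariate(1.0) for _ in range(1000)]   # all particles start here
right = []        # the initially empty compartment
journal = []      # jfs-style log: one entry per gate operation

# The demon: move every fast particle to the right, journaling each move.
for i in reversed(range(len(left))):
    if left[i] > THRESHOLD:
        particle = left.pop(i)
        right.append(particle)
        journal.append(particle)

def undo_last(left, right, journal):
    """Replay the journal backwards one step, restoring the previous state."""
    particle = journal.pop()
    right.remove(particle)
    left.append(particle)

mean = lambda xs: sum(xs) / len(xs)
# mean(right) now exceeds mean(left): the right compartment is "hotter",
# and the journal lets the demon restore any earlier configuration (step 6).
```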

If the entropy of the whole universe (system) is always increasing, then there are some parameters that need to be determined:

  1. The minimum energy (in nature) to write one bit of information (this may not be related to the electrical energy driving the hard disk).
  2. The amount of energy required to erase the bit of information.
  3. The shortest algorithm for the demon to operate, including the compression and compaction of information in the jfs.
  4. The change of entropy of the jfs during the execution.
  5. The effect of decoherence in quantum computing.

Time and Space

When I was a child I was fascinated by Einstein's famous equation E = mc². Ten years after graduating, I still do not quite understand what energy is. Energy is abstract, yet it is also tangible and has mass. From the above, if information is entropy, then it has energy, and so there is mass. If we describe the motion of this "mass" with a Hamiltonian, then we can state it in terms of Poisson brackets. Because Poisson brackets are canonical invariants, equations expressed in terms of Poisson brackets are invariant in form under canonical transformations. And canonical transformations that are analytic functions of continuous parameters form groups (Lie groups). For example, a system with spherical symmetry is invariant under rotation about any axis (the SU(2) group). Shall we therefore conclude that information is observable only in "symmetrical" space (note that each possible measurement on a system corresponds to a resolution of its Hilbert space into orthogonal subspaces)? The "subspace" (the term borrowed from the sci-fi Star Trek, the inter-galactic communication channels used by Star Fleet) which does not possess the required symmetry properties should carry no measurable information.

If a wave function carries information.

Quantum Time

Another related topic is the Heisenberg uncertainty principle.
There are two well-known uncertainty pairs:

  1. p-x, momentum-position
  2. E-t, energy-time

They have their origin in the compatibility properties of the operators corresponding to the observables being measured. One very interesting thing to note is that we have momentum, position (Dirac delta function), and energy (Hermitian) operators, but no time operator exists! The commutator property is very clear in the p-x relation, but what about E-t? If energy carries information, it can be expressed in terms of entropy. The mathematical construction of entropy also makes it possible to treat it as an operator. If so, by the uncertainty principle, the time period of measurement would be inversely proportional to the change of entropy: the smaller the entropy change, the longer the duration of the measurement.

Quantum Computation

Boolean Algebra

A Boolean algebra is a closed algebraic system containing a set B of two or more elements and two binary operations, + (OR) and · (AND), with the following properties:

0 is called the identity element for the OR and 1 is called the identity element for the AND. The complement is referred to as the NOT or negation operation. The AND operation is also referred to as conjunction and OR operation is as disjunction.

De Morgan's theorem describes the relationship between the operations + (OR), · (AND), and negation:

(a + b)' = a' · b'
(a · b)' = a' + b'
For Boolean functions there exist universal sets of operations with only one element. The NAND and NOR operations can be used to build any other function. The physical implementation of a Boolean function is carried out by interconnection of electronic gates. The NAND gate is an AND gate followed by an inverter. A NAND gate can have two or more inputs. The NOR gate is an OR gate followed by an inverter. The NOR gate can also have two or more inputs.
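That NAND alone suffices can be checked with a short sketch (Python; the gate names are the obvious ones):

```python
def NAND(a, b):
    """The universal gate: output 0 only when both inputs are 1."""
    return 1 - (a & b)

# Every other Boolean operation built from NAND alone:
def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))          # De Morgan in action
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))
```

An analogous construction starting from NOR works the same way, which is why both gates are called universal.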

A B | NAND NOR XOR XNOR
0 0 |  1    1   0    1
0 1 |  1    0   1    0
1 0 |  1    0   1    0
1 1 |  0    0   0    1

CMOS implementations: 4011 (NAND), 4001B (NOR), 4030 (XOR), 4077 (XNOR)

The exclusive-NOR (or XNOR) gate produces a 1 output only when the inputs are at the same logic level; it is also known as the even-parity function. The exclusive-OR (or XOR) gate produces a 1 output when either input is 1, but not when both are 1. XOR is also called the odd-parity function, and is the basis of error-handling circuits. It can also be interpreted as (numerical) summation modulo 2. The XOR gate is widely used in digital circuits that perform mathematical functions.
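The parity reading of XOR is what makes it useful for error handling: appending a parity bit forces every valid codeword to even parity, so any single bit flip becomes detectable. A minimal sketch (Python; the function names are illustrative):

```python
from functools import reduce

def parity(bits):
    """XOR of all bits: 1 iff an odd number of them are 1 (odd-parity function)."""
    return reduce(lambda a, b: a ^ b, bits, 0)

def add_parity_bit(bits):
    """Append a bit so that the whole codeword has even parity."""
    return bits + [parity(bits)]

def check(word):
    """True if parity is even; a single flipped bit makes this False."""
    return parity(word) == 0
```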

Quantum Gates

Quantum Algorithms

Quantum Information Theory

Von Neumann Entropy

Measures of Entanglement

Quantum Coding

  1. Michael A. Nielsen, Isaac L. Chuang, Quantum Computation and Quantum Information (2000), Cambridge University Press.
  2. Feynman and Computation, edited by Anthony J. G. Hey (1999), Perseus Books.
  3. Mark W. Zemansky and Richard H. Dittman, Heat and Thermodynamics (1997), McGraw Hill
  4. W. H. Zurek, Algorithmic Randomness, Physical Entropy, Measurements, and the Demon of Choice, in [2] above.
  5. Yorick Hardy and Willi-Hans Steeb, Classical and Quantum Computing (2000), Birkhäuser Verlag.