 
 

history/developments
 
 

1965

Physicist Richard Feynman, deeply involved in the development of the first atomic bomb, receives the Nobel Prize for his theory of quantum electrodynamics, the realm concerned with the way in which electrons interact with one another through the electromagnetic force, propagated through the photon.  Having created simple visual depictions (now known as Feynman diagrams) of the possible interactions between electrons and photons and other atomic interactions, Feynman also predicts that antiparticles, particles which possess a charge opposite to that of their mirror particles, are actually just normal particles which move backwards in time.


1980

Feynman, among others, begins to investigate how the concepts of conventional information science might be generalized to quantum physical processes, considering how binary numbers could be represented by the quantum states of two-state quantum systems: in other words, simulating quantum systems not with conventional computers but with other quantum systems constructed for that purpose.


1985

David Deutsch, of Oxford, publishes a theoretical paper describing a universal quantum computer, showing that if a two-state quantum system could be made to evolve by means of a small set of simple operations, those operations could be composed to produce any possible evolution and thereby simulate any physical system; these operations come to be called quantum 'gates', as they function much like the binary logic gates of classical computers.


1994 - Shor’s Algorithm

Peter Shor, working for AT&T, proposes a method using the entanglement and superposition of qubits to find the prime factors of an integer, a rather valuable capability, as many encryption systems rely on the difficulty of factoring large numbers.  In principle, his algorithm, executed on a quantum computer, would far surpass the efficiency of any factoring method known for conventional computers; Shor’s discovery proves instrumental in provoking a storm of research by both physicists and computer scientists.
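
The heart of the method is a reduction of factoring to finding the period of the function a^x mod N; only that period-finding step needs a quantum computer, and the rest is ordinary number theory.  The Python sketch below illustrates that classical reduction with a brute-force stand-in for the quantum step; the function names and the example N = 15 are illustrative choices, not drawn from Shor’s paper.

    from math import gcd
    from random import randrange

    def find_period(a, n):
        # Brute-force search for the order r of a modulo n (smallest r with a**r % n == 1).
        # This is the step Shor's algorithm performs efficiently on a quantum computer.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def factor(n):
        # Classical reduction of factoring to period-finding; assumes n is an odd
        # composite with at least two distinct prime factors (e.g. 15).
        if n % 2 == 0:
            return 2
        while True:
            a = randrange(2, n)
            d = gcd(a, n)
            if d > 1:                      # lucky guess: a already shares a factor with n
                return d
            r = find_period(a, n)
            if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
                return gcd(pow(a, r // 2, n) - 1, n)

    print(factor(15))                      # prints 3 or 5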


1995

The National Institute of Standards and Technology and the California Institute of Technology jointly take up the problem of shielding a quantum system from environmental influences, performing experiments with electromagnetic fields in which particles (ions) are trapped and cooled into a quantum state.  This method, however, allows only devices of a few bits to be created, and they lose coherence rapidly.


1996 – present

A team composed of researchers from the University of California at Berkeley, MIT, Harvard University, and IBM pursues a somewhat similar technique, but using nuclear magnetic resonance (NMR), a technology which manipulates quantum information in liquids.  They attempt to blunt the threat of decoherence by working with a vast number of quantum computers at once, so that each qubit is represented by many, many molecules, diminishing the effect of external forces on any single one.  NMR acts on quantum particles in the atomic nuclei of the fluid through their “spin”; the alignment of a given particle’s spin encodes its value, 0 or 1.  By varying the applied electromagnetic field, oscillations can be found which allow certain spins to flip between these states, or to exist in both at once.  Moreover, the constant motion of molecules in the liquid creates interactions that allow logic gates, the basic units of computation, to be constructed through NMR.

The team develops a 2-bit quantum computer made from a thimbleful of chloroform; the input consists of radio-frequency pulses sent into the liquid, containing, in essence, the compiled program to be executed.
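
As a rough illustration of the “exist in both at once” idea, the sketch below models an RF pulse as an idealized rotation of a single spin: a 90-degree pulse takes a spin that is definitely 0 into an equal superposition of 0 and 1.  This is a toy NumPy model, not a description of the actual chloroform apparatus.

    import numpy as np

    def rx(theta):
        # Idealized RF pulse: rotation of a spin by angle theta about the x-axis.
        return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                         [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

    spin = np.array([1, 0])                # spin aligned with the field: a definite 0
    spin = rx(np.pi / 2) @ spin            # a 90-degree pulse
    print(np.abs(spin) ** 2)               # [0.5 0.5] -- equal weight on 0 and 1 at once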

The algorithm run through the quantum computer is one devised by Lov Grover of Bell Laboratories.   In computer science, the efficiency of an algorithm is often described by the time it takes to complete as a function of the size of its input.   If a matrix of N by N elements is searched exhaustively for a particular value, for example, the search will take N² tries in the worst case.  If N unsorted values are input, a straightforward linear search will take N/2 tries on average.  Grover’s quantum algorithm finds the item in O(√N) steps.  With the quantum computer developed, a list of four items is subjected to this algorithm, which proves able to find the desired item in a single step.
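
To see why a single step suffices for four items, here is a small NumPy sketch of one Grover iteration, an oracle sign-flip followed by the “inversion about the mean” diffusion step; the marked index chosen below is an arbitrary illustration, not the value used in the experiment.

    import numpy as np

    N = 4                                  # a four-item search space, as in the experiment
    marked = 2                             # index of the item being searched for (arbitrary)

    # Start in the uniform superposition over all N items.
    state = np.full(N, 1 / np.sqrt(N))

    # Oracle: flip the sign of the amplitude of the marked item.
    state[marked] *= -1

    # Diffusion ("inversion about the mean"): reflect every amplitude about the average.
    state = 2 * state.mean() - state

    # For N = 4, one oracle-plus-diffusion step pushes the marked item's probability to 1.
    print(state ** 2)                      # [0. 0. 1. 0.]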


1998

In 1993, the feasibility of quantum teleportation is proposed by an international team of researchers, who base their conclusions on a quantum-mechanical phenomenon known as the Einstein-Podolsky-Rosen effect.  The effect describes how two particles which come into contact become “entangled,” parts of the same quantum system.  The group theorizes that two entangled, “transporter” particles introduced to a third, “message” particle might transfer properties from one to the other.  The idea is actually put into practice five years later by researchers at the University of Innsbruck in Austria: two pairs of entangled photons are brought together, and it is shown that the polarization state of one may be transferred to another.  The discovery has implications for data transfer and networking among quantum particles in quantum computing.
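
For the curious, the following NumPy sketch walks through the standard teleportation protocol described above: the message qubit is entangled with one half of the transporter pair, two measurement outcomes are noted, and two conditional flips recover the message state on the far qubit.  The particular amplitudes and the qubit bookkeeping are illustrative assumptions, not details of the Innsbruck experiment.

    import numpy as np

    # Qubit 0 carries the "message" state; qubits 1 and 2 are the entangled pair.
    alpha, beta = 0.6, 0.8                          # an arbitrary example message state

    I2 = np.eye(2)
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    X = np.array([[0, 1], [1, 0]])
    Z = np.array([[1, 0], [0, -1]])
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    message = np.array([alpha, beta])
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)      # entangled "transporter" pair
    state = np.kron(message, bell)                  # 3-qubit state, qubit 0 most significant

    # Entangle the message with one half of the pair, then rotate it for measurement.
    state = np.kron(CNOT, I2) @ state               # CNOT: control qubit 0, target qubit 1
    state = np.kron(H, np.eye(4)) @ state           # Hadamard on qubit 0

    # Measure qubits 0 and 1; each of the four outcomes occurs with probability 1/4.
    state = state.reshape(2, 2, 2)                  # axes: qubit 0, qubit 1, qubit 2
    probs = np.sum(np.abs(state) ** 2, axis=2).ravel()
    outcome = np.random.choice(4, p=probs)
    m0, m1 = outcome // 2, outcome % 2

    # Qubit 2 collapses to the message state, up to two known, correctable flips.
    remote = state[m0, m1]
    remote = remote / np.linalg.norm(remote)
    if m1:
        remote = X @ remote                         # bit flip if qubit 1 measured 1
    if m0:
        remote = Z @ remote                         # phase flip if qubit 0 measured 1
    print(remote)                                   # equals (alpha, beta): state transferred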

