Overview
The bit, the most basic unit of information in a computer, is the building block for all data residing within it. An alphanumeric character, for example, usually consumes 1 byte, or 8 bits, of memory. A 2-byte unsigned integer can range from 00000000 00000000 to 11111111 11111111 in binary, or 0 to 65535 in decimal notation (that is, 0 to 2^16 − 1).
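As a quick illustration of those ranges, here is a small Python sketch (not part of the original text) that prints the largest values representable in one unsigned byte and in two unsigned bytes:

```python
# Minimal sketch: the value ranges of an 8-bit byte and a 16-bit (2-byte)
# unsigned integer, as quoted above.

byte_max = 2**8 - 1        # 255: largest value a single byte can hold
uint16_max = 2**16 - 1     # 65535: largest 2-byte unsigned value

print(f"1 byte:  0 .. {byte_max}   ({byte_max:08b} in binary)")
print(f"2 bytes: 0 .. {uint16_max} ({uint16_max:016b} in binary)")
```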
The “bit” of a quantum computer, referred to as a qubit (short for quantum bit), might be represented by a single atom. Qubits, however, possess an intrinsic and most significant property: they may be assigned 0 and 1 simultaneously, with each value weighted by a numerical coefficient that determines its probability of being observed. This ability allows a quantum computer to exist in multiple states at once, a condition known as superposition. Consequently, operations may be carried out on all of those states at the same time, so a single processing unit can perform many computations in parallel. A related phenomenon, dubbed “entanglement,” links qubits together within a quantum system.
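To make the idea of coefficients and superposition concrete, the following Python/NumPy sketch (an illustration, not part of the original text) represents one qubit as a pair of amplitudes; the names alpha and beta are just conventional labels for the coefficients of 0 and 1.

```python
import numpy as np

# A single qubit as a 2-element vector of complex amplitudes:
# alpha weights the |0> outcome, beta weights the |1> outcome.
# The squared magnitudes give the measurement probabilities and must sum to 1.

alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # an equal superposition of 0 and 1
qubit = np.array([alpha, beta], dtype=complex)

probabilities = np.abs(qubit) ** 2             # -> [0.5, 0.5]
print("P(0), P(1):", probabilities)

# Simulated measurement: the superposition "collapses" to a single outcome.
outcome = np.random.choice([0, 1], p=probabilities)
print("measured:", outcome)
```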
Superposition opens the way for several fascinating, and potentially problematic, uses of quantum computers. Factoring numbers of several hundred digits, a task required to crack some encryption schemes currently in use, would take billions of years on the fastest supercomputers; theoretically, a quantum computer might accomplish it in about a year. Researchers speculate that other, similarly daunting mathematical tasks will become trivial as well.
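For a sense of why classical factoring is so slow, the sketch below (illustrative only; the function name factor_trial_division is an arbitrary choice) uses naive trial division, whose cost grows roughly with the square root of the number being factored, which is hopeless at several hundred digits.

```python
import math

def factor_trial_division(n: int) -> list[int]:
    """Factor n by naive trial division; cost grows roughly with sqrt(n)."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# Fast for small numbers...
print(factor_trial_division(65535))   # [3, 5, 17, 257]
# ...but a 300-digit number would need on the order of sqrt(10**300) = 10**150
# trial divisions, far beyond the reach of any classical machine.
```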
The difficulty in sustaining a quantum computer lies in its nearly unavoidable interaction with the external environment, which inexorably leads to a collapse of the system, termed decoherence. NMR techniques applied to liquids have produced quantum computers of a couple of qubits (see History/Developments), but these systems lose coherence after a few minutes, allowing at most about 1,000 operations before succumbing to the thermal motion of the molecules in the liquid.