The Many Worlds of David Deutsch
Why do some physicists now believe that there are many parallel universes very like our own? And if there are, how will this help us build faster computers?
Quantum mechanics was developed in the early years of the 20th century. Starting with the discovery that energy cannot come in infinitely small amounts but comes in discrete packets or quanta, an elaborate mathematical structure was worked out which successfully predicted a very wide range of physical phenomena. The years went by and more and more experimental evidence piled up to attest to the theory’s extraordinary accuracy. However, the theory was very mathematical, and it was not clear what exactly it meant in physical terms – and thus what its predictive success told us about the physical nature of the universe. All the physical interpretations put forward have been shockingly counterintuitive.
The best known and historically the most influential was the Copenhagen interpretation. It says that there is an inherent duality in nature, called ‘complementarity’, according to which attributes that are classically contradictory (such as being a localised particle or a spread-out wave) can both be part of the makeup of the same physical object, but they can never be observed in the same experiment. Asking which attribute the object has objectively is deemed meaningless: the nature of the measurement determines which property is manifested. The value of the measured quantity (e.g. the specific position) is determined randomly at the moment of observation or interaction with the ‘classical level’. This random change is known as ‘collapsing the wave function’. Einstein argued against this interpretation for thirty years, saying among other things that “God doesn’t play dice”.
The Many-Worlds interpretation, introduced by Hugh Everett in 1957 and currently advocated by David Deutsch and others, says that there are a large number of parallel universes with greater or lesser similarity to our own. The ‘neighbouring’ universes are ones which differ from our own only in the positions of a few particles. Neighbouring universes can’t be detected directly, but the particles in them can have an interference effect on the corresponding particles in our own universe, which explains the strange behaviour of particles in interference experiments and, one day, will underpin quantum computers. Overall, reality (the ‘multiverse’) is non-random and independent of observers.
Other interpretations include the Pilot Wave interpretation put forward by the late David Bohm.
Computer circuits have been getting smaller and smaller. It was once assumed that a fundamental limit would be reached when circuits and logic gates were only a few atoms in size, because then quantum mechanical effects would become significant. Instead it has been discovered that it is possible to actually exploit these quantum effects to build computers of a completely new type.
In a traditional computer, circuits can be at one of two voltage levels, which represent ‘0’ and ‘1’. In a quantum computer, the information could instead be encoded as two states of polarisation of light, or as two energy states of an atom. However, according to quantum mechanics, the atom can exist in a superposition of its two states – in other words it can be in state ‘0’ and state ‘1’ at the same time!
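The idea above can be sketched numerically. In the standard formalism a qubit is described by two complex amplitudes, one per state, and the squared magnitude of each amplitude gives the probability of reading that state. This is a minimal illustration, not tied to any particular hardware:

```python
from math import sqrt

# A classical bit is either 0 or 1.  A qubit is described by a pair of
# complex amplitudes (a0, a1), one for each state, normalised so that
# |a0|^2 + |a1|^2 = 1.
qubit = (1 / sqrt(2), 1 / sqrt(2))   # equal superposition of '0' and '1'

p0 = abs(qubit[0]) ** 2   # probability of measuring '0'
p1 = abs(qubit[1]) ** 2   # probability of measuring '1'
print(p0, p1)             # each is 0.5: the qubit is 'both at once'
```

Until it is measured, the qubit genuinely carries both amplitudes; measurement yields ‘0’ or ‘1’ with the probabilities above.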
A basic principle underlying quantum computing can be demonstrated by the simple experiment in figure 1. In Fig 1a, photons (particles of light) are fired one at a time at a half-silvered mirror. Statistically speaking, half of them pass straight through and are detected by detector A, while the other half are reflected at ninety degrees and hit detector B. So far, so obvious. However, consider the slightly more complicated arrangement in Fig 1b. Half the time the photon passes through the half-silvered mirror and travels along path X; the other half of the time it is reflected onto path Y. In either case, the photon then hits an ordinary mirror and is deflected onto a second half-silvered mirror. One might reasonably expect that in each case half the photons would pass through and half be deflected, just as in the earlier experiment, so that equal numbers of photons would be detected at C and D. The truth, however, is weirder. All the photons end up at detector C, and none at D. Weirder still, if either path X or path Y is then blocked, the two detectors start to detect equal numbers of photons. Why should blocking either path cause photons to start arriving at D, when they weren’t going there before? The explanation according to quantum mechanics is that the experiment is in two states at once, superimposed: the photon is both going down path X and going down path Y. In ‘many-worlds’ language, the photon on path X has an invisible twin going down path Y in a neighbouring universe which interferes with it.
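The arithmetic behind this experiment can be worked through directly. The sketch below tracks the photon as a pair of complex amplitudes over the two paths; a 50/50 half-silvered mirror mixes the amplitudes (in one common convention, reflection contributes a factor of i). The path labels and the assignment of output ports to detectors C and D are conventions chosen for illustration:

```python
from math import sqrt

# The photon's state is a pair of complex amplitudes over the paths (X, Y);
# detection probabilities are the squared magnitudes of the amplitudes.

def beam_splitter(state):
    """50/50 half-silvered mirror; reflection picks up a factor of i
    (one standard convention)."""
    x, y = state
    return ((x + 1j * y) / sqrt(2), (1j * x + y) / sqrt(2))

photon = (1 + 0j, 0j)       # the photon enters along path X

# Fig 1b: two half-silvered mirrors, neither path blocked.
out = beam_splitter(beam_splitter(photon))
print([abs(a) ** 2 for a in out])
# -> [0.0, 1.0]: the two routes interfere, so every photon reaches one
#    detector (C) and none reaches the other (D).

# Now block path Y between the mirrors: its amplitude is absorbed.
mid = beam_splitter(photon)
blocked = (mid[0], 0j)
out_blocked = beam_splitter(blocked)
print([abs(a) ** 2 for a in out_blocked])
# -> [0.25, 0.25]: half the photons are absorbed by the block, and the
#    survivors now arrive at C and D in equal numbers.
```

With both paths open, the amplitude for reaching D cancels exactly; removing one path removes the cancellation, which is why blocking a route makes photons appear at D.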
The basic unit of information stored in a computer is called a bit; the equivalent in a quantum computer is called a qubit. Qubits, as we’ve seen, can be in two superimposed states at once. A ‘register’ of three bits can store any one number between 0 and 7 coded in binary; a register of three qubits can store all eight numbers from 0 to 7 simultaneously, and, more to the point, a subsequent computing operation can be carried out on all those numbers simultaneously. As the registers of qubits get larger, the power of the quantum computer grows exponentially.
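The register picture can be sketched in the same amplitude language. A register of n qubits is described by 2^n complex amplitudes, one per binary value, and a single reversible operation rearranges all of them at once. The operation below (adding one, modulo eight) is a toy example chosen for illustration:

```python
from math import sqrt

# A register of n qubits is described by 2**n amplitudes, one for each
# basis state |000> .. |111>.  Three qubits -> eight amplitudes.
n = 3
amps = [1 / sqrt(2 ** n)] * (2 ** n)   # equal superposition of 0..7

# A reversible operation acts on every basis state in a single step.
# Toy example: increment each stored value, x -> (x + 1) mod 8, which
# simply permutes the amplitudes.
new_amps = [amps[(x - 1) % 8] for x in range(8)]

# All eight numbers were processed at once; a measurement would still
# yield each outcome with probability 1/8.
probs = [abs(a) ** 2 for a in new_amps]
print(probs)
```

Doubling the register from three to four qubits doubles the amplitudes from 8 to 16, which is the exponential growth the paragraph describes. (The catch, not covered here, is that a measurement reveals only one of the superposed results.)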