A Quantum Future - Computing and Cryptography

By Wil McCarthy

Quantum computers: The secret is out

Cryptography, or "hidden writing," is the art and science of hiding information, and typically involves burying messages--ordinary numbers and text--in a sequence of characters that appears random, yet is easily translated by the intended recipients. Applications for cryptography range from secret military transmissions to subscriber-based satellite TV to secure credit card transactions online. A simple example is the old grammar-school "add one to each letter" code, where A becomes B, B becomes C, and Z wraps around to A again. Really advanced crypto-kids may even add three or six or negative eleven to their letters, making them that much harder to figure out. These schemes, called "simple substitution ciphers," are notoriously insecure, and can be cracked with a pencil and paper or a plastic decoder ring from a cereal box.
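That grammar-school scheme fits in a few lines of Python (a minimal sketch; the function name and the HAL/IBM example are illustrations of mine, not from the original):

```python
def shift_cipher(text, shift):
    """Encrypt by adding `shift` to each letter, wrapping Z back around to A.
    A negative shift (or the complementary positive one) decrypts."""
    result = []
    for ch in text.upper():
        if ch.isalpha():
            # Map A-Z to 0-25, add the shift modulo 26, map back to a letter.
            result.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
        else:
            result.append(ch)  # leave spaces and punctuation alone
    return ''.join(result)

print(shift_cipher("HAL", 1))   # "add one to each letter" -> IBM
print(shift_cipher("IBM", -1))  # subtracting one decrypts -> HAL
```

Because the same fixed shift applies to every letter, letter-frequency patterns survive intact, which is exactly why these ciphers fall to pencil and paper.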

Still, a closely related "poly-substitution cipher" involving a typewriter-like machine with rotating typewheels was employed by the German, Italian, and Japanese militaries during World War II. This scheme was considered unbreakable at the time, because the rotation of the wheels (in essence, the number added to each letter in the character set) changed with every keystroke, according to a complex formula based on an alphanumeric "key" which changed every day. But complexity and randomness are not the same thing; through a combination of brilliant mathematics and mechanized trial-and-error, the Polish, British, and American governments were able to deduce the functioning of the "Enigma" device and read most--though by no means all--of the enemy's encrypted radio traffic.
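The core idea--a shift that advances with every keystroke--can be sketched in a few lines (a toy illustration only; the real Enigma's rotor wirings, multiple wheels, and plugboard were far more elaborate):

```python
def rotor_encrypt(text, key):
    """Toy poly-substitution cipher: the letter shift starts at the daily
    numeric `key` and advances by one after every letter, so repeated
    plaintext letters come out differently each time."""
    out, shift = [], key
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
            shift = (shift + 1) % 26  # the "wheel" rotates one notch
        else:
            out.append(ch)
    return ''.join(out)

print(rotor_encrypt("AAAA", 3))  # -> DEFG, not DDDD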

Part of this information-age victory can be attributed to primitive digital computers, storing data as sequences of binary numerals--zeroes or ones. Today, these numerals are called "bits", and can be strung together to represent virtually any type of information. The "key" to an encrypted message is normally a string of numbers and letters, but in principle it could be anything from a picture to a DNA sequence to a musical phrase. Regardless of what the key is, it can always be represented as a sequence of bits. What's more, if the encryption formula (or "algorithm") and key size are known or can be artfully deduced, then a computer can crack the code by exhaustion, simply trying all possible combinations of ones and zeroes until it finds a key which spits out an intelligible message.
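Cracking by exhaustion is easy to demonstrate on a deliberately tiny key. The sketch below uses a toy 8-bit XOR "cipher" and a crude intelligibility test of my own invention (real code breakers use statistical tests, not a known crib word):

```python
def toy_decrypt(ciphertext, key):
    """Toy cipher: XOR every byte with an 8-bit key (XOR is its own inverse,
    so the same function both encrypts and decrypts)."""
    return bytes(b ^ key for b in ciphertext)

def crack_by_exhaustion(ciphertext, key_bits=8):
    """Try all 2**key_bits possible keys, keeping any that yield an
    'intelligible' message--here, printable ASCII containing a likely word."""
    hits = []
    for key in range(2 ** key_bits):
        plaintext = toy_decrypt(ciphertext, key)
        if all(32 <= b < 127 for b in plaintext) and b"ATTACK" in plaintext:
            hits.append((key, plaintext))
    return hits

ciphertext = toy_decrypt(b"ATTACK AT DAWN", 0x5A)  # "encrypting" with key 0x5A
print(crack_by_exhaustion(ciphertext))             # recovers the key and message
```

With only 256 possible keys this loop finishes instantly; the whole argument of the next section is what happens when the loop has 2^64 iterations instead.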

18 billion seconds

Unfortunately (or perhaps fortunately), the 64-bit keys in common use today represent 2^64, or over 18 billion billion, combinations, meaning your typical 1000 megahertz (1 GHz) desktop computer would need 18 billion seconds--over 580 years--to reach a guaranteed solution. So if you really, really want to read encrypted messages, the only feasible workaround is massive parallelism. Six hundred 1 GHz computers working together could crack that same message in less than a year, and a program on the scale of the SETI@home screen saver project could crack it in just under three hours. As the Axis powers found to their woe during World War II, cryptography does not and cannot guarantee privacy against a smart and determined adversary. Still, barring secret "backdoor" keys left in place by sneaky programmers or nervous government agencies, the impracticality of getting two million of your closest friends together means that the vast majority of 64-bit encrypted messages are going to be read by their intended recipients and no one else.
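The arithmetic above is easy to check directly (assuming, as the figures imply, one key tested per clock cycle):

```python
keys = 2 ** 64                        # 18,446,744,073,709,551,616 possible keys
rate = 10 ** 9                        # guesses per second at 1 GHz, one per cycle
seconds = keys / rate                 # ~18.4 billion seconds for a full sweep
years = seconds / (365 * 24 * 3600)

print(f"{years:.0f} years for one machine")            # just under 585
print(f"{years / 600:.2f} years for 600 machines")     # a bit under a year
print(f"{seconds / 2_000_000 / 3600:.1f} hours for 2 million machines")
```

(Those are worst-case figures for a guaranteed solution; on average the right key turns up halfway through the sweep.)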

But all that may be about to change.

The invention of the digital computer is generally credited to Charles Babbage, a British mathematician whose government funded him, beginning in the 1820s, to devise a machine--the "difference engine"--for the calculation of logarithm tables. But taking inspiration from the ingeniously programmable Jacquard weaving machine of his day, Babbage eventually drew up a generalized computing device called the "analytical engine." And while the sorely unamused British government declined to fund this nutty idea, later generations have verified that the machine, as designed, does in fact work, and Babbage's prescient use of punched cards for information storage was widely adopted in the mid-20th century. Moreover, his separation of process (software) from input and output (data) is the underlying principle for all of today's computer designs.

But all across the world, researchers in the field of "quantum computing" are attempting to stuff these two genies back into the same bottle again. While a normal, digital bit can hold only a one or a zero, a "quantum bit" or qubit instead holds a "superposition of states" which can be treated as a zero or a one or, rather mysteriously, both at the same time. As physicist Erwin Schrödinger observed in the 1920s and 30s, events and conditions are, in a physical sense, not determined until observed. His famous "Schrödinger's Cat" thought experiment--which he actually intended as a critique of this very idea--concludes that in the eyes of the universe, a boxed feline is, literally, neither alive nor dead until some definitive test or measurement--an observation--is performed to determine it one way or the other. This surreal notion has led many scientists to Eastern philosophies which espouse an "observer-created reality," in which matter, force, and time are seen as direct manifestations of human consciousness, rather than the other way around.

The unobserved catbox

This viewpoint does have practical applications, in that it can be poured into a bucket and used to wash hogs. Schrödinger's use of the word "observation" reflects the fact that on a subatomic scale, where wavelengths of light are too long to resolve anything, particles can only be detected through collision with other particles. Like billiard balls smacked with a cue ball, these "observed" particles see their behavior permanently and dramatically altered by the experience. Whether witnessed by human consciousness or not, such particles will bombard and disturb and "observe" the atoms of Schrödinger's cat by the trillions of trillions, unless its catbox has been specifically--one might almost say magically--designed to prevent this.

Which is exactly how quantum bits work: by temporarily excluding the flow of light, matter, and information across their boundaries, permitting the "unobserved" contents to remain in a physically indeterminate state until certain conditions are met and the qubit value "collapses" into a definite one or zero (e.g., the "up" or "down" spin direction of a single, isolated atom).

However nutty this may sound, it is solidly factual; qubits were first demonstrated in 1995 by the National Institute of Standards and Technology (NIST) in Boulder, and by the California Institute of Technology in Pasadena. And while one qubit can only collapse into two possible states, five qubits together can collapse into 2^5, or 32 different states. And before the collapse, the qubits, in a very literal sense, can both store and perform computations on all 32 states, a feat which a binary computer would need 32 sets of 5, or 160 bits in all--plus calculating hardware--to match. As of 2000, this principle has also been demonstrated in the real world, by researchers from IBM and Stanford University who have built and tested an actual 5-qubit computer.
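The bookkeeping behind those numbers can be sketched on an ordinary computer (a classical simulation, so it only illustrates the state-counting and the collapse, not the quantum speedup; the variable names are mine):

```python
import random

n = 5
dim = 2 ** n                      # five qubits span 2**5 = 32 basis states

# An equal superposition: all 32 basis states share the same amplitude,
# so the register "holds" every 5-bit value at once until it is measured.
amplitude = 1 / dim ** 0.5
state = [amplitude] * dim

# Measurement collapses the superposition into one definite bit string,
# chosen with probability equal to the squared amplitude.
probabilities = [a * a for a in state]
outcome = random.choices(range(dim), weights=probabilities)[0]
print(f"collapsed to |{outcome:05b}>")
```

Notice the catch: simulating those 32 amplitudes classically already takes 32 stored numbers, and the count doubles with every added qubit--which is exactly why a real quantum register is so much more powerful than the machine simulating it.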

To put it mildly and briefly, qubits provide the massive parallelism which code breakers seek. A 64-qubit computer could in principle crack that "secure" 64-bit-key encrypted message in a single run, and could crack today's top military codes in the merest fraction of a second. Moreover, the performance gains of quantum computing are exponential; while a 64-qubit computer is 18 billion billion (2^64) times as powerful as a 64-bit binary one, a 100-qubit computer is 68 billion (2^36) times faster still. It's difficult even to imagine what a million qubits (1Mqb) could accomplish, but it's a safe bet that the world's most secret information, if stored or transmitted electronically, would be about as secure as the Cap'n Crunch ciphers of our youth.

Making new friends from scratch

There are, of course, other computing problems which are currently unsolvable, or nearly so, but which would benefit enormously from the power and parallelism of quantum computing. Among these are the study of prime numbers (also important in cryptography) and the simulation of nuclear explosions, supernovae, and clean, peace-oriented fusion reactions. In biotechnology there is the accurate prediction of protein folding: determining the final three-dimensional shape a given amino acid sequence will eventually settle into. Still other applications include the simulation of chaotic systems like weather, ecology, and maybe even human societies. Asimov's Foundation, anyone?

And while progress in artificial intelligence has so far been disappointingly slow, the chess computer Deep Blue--which defeated world champion Garry Kasparov--shows us that even the crudest analytical techniques can yield "intelligent"--even brilliant--results if enough computing power is thrown behind them. Indeed, our own brains rely on this principle for their daily operation. So with analytical engines a googolplex faster than those in use today (and a billion googolplexes faster than those of Charles Babbage), we may finally discover the true killer app of computing: making new friends, from scratch.


Last edited Sun Oct 01 20:09:56 2006.