Quantum Computers Are Like Kaleidoscopes, Helping Illustrate Science and Technology

Quantum computing is like Forrest Gump’s box of chocolates: You never know what you’re gonna get. Quantum phenomena – the behavior of matter and energy at the atomic and subatomic levels – are not definite, one thing or another. They are opaque clouds of possibility or, more precisely, probabilities. When someone observes a quantum system, it loses its quantum-ness and “collapses” into a definite state.

Quantum phenomena are mysterious and often counterintuitive. This makes quantum computing difficult to understand. People naturally reach for the familiar to attempt to explain the unfamiliar, and for quantum computing this usually means using traditional binary computing as a metaphor. But explaining quantum computing this way leads to major conceptual confusion, because at a base level the two are entirely different animals.

This problem highlights a common but often mistaken belief: that familiar metaphors are more useful than exotic ones when explaining new technologies. Sometimes the opposite is true. The freshness of the metaphor should match the novelty of the discovery.

The uniqueness of quantum computers calls for an unusual metaphor. As a communications researcher who studies technology, I believe that quantum computers can be better understood as kaleidoscopes.

Digital Certainty vs. Quantum Probabilities

The conceptual gap between classical and quantum computers is a wide chasm. Classical computers store and process information via transistors, electronic devices that take only binary, deterministic states: one or zero, yes or no. Quantum computers, in contrast, handle information probabilistically at the atomic and subatomic levels.

Classical computers use the flow of electricity to sequentially open and close gates to record or manipulate information. Information flows through circuits, triggering actions through a series of switches that record information as ones and zeros. Represented with binary math, these bits are the foundation of all things digital, from the apps on your phone to the account records at your bank and the Wi-Fi signals bouncing around your home.
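
To make that determinism tangible, here is a tiny Python snippet, a minimal sketch using only the standard library and no assumptions about any particular machine, that shows a familiar piece of data as the fixed string of ones and zeros a classical computer actually stores:

```python
# A classical bit string: every piece of digital data reduces to definite ones and zeros.
message = "Hi"
bits = " ".join(f"{byte:08b}" for byte in message.encode("ascii"))
print(bits)  # 01001000 01101001 -- two characters stored as two fixed bytes of bits
```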

In contrast, quantum computers use changes in the quantum states of atoms, ions, electrons or photons. Quantum computers link, or entangle, multiple quantum particles so that changes to one affect all the others. They then introduce interference patterns, like multiple stones tossed into a pond at the same time. Some waves combine to create higher peaks, while some waves and troughs combine to cancel each other out. Carefully calibrated interference patterns guide the quantum computer toward the solution of a problem.
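
The pond analogy can be made concrete with a few lines of code. The sketch below is a deliberately simplified illustration in plain Python, not a model of any particular quantum hardware or library: it treats two computational "paths" as complex amplitudes and shows how in-phase amplitudes reinforce an outcome while out-of-phase amplitudes cancel it, which is the effect a quantum algorithm's calibrated interference exploits.

```python
import cmath

# Two paths leading to the same outcome, each carrying a complex amplitude
# (a magnitude and a phase), loosely like two ripples meeting in a pond.
path_a = cmath.rect(0.5, 0.0)                    # amplitude 0.5, phase 0
path_b_in_phase = cmath.rect(0.5, 0.0)           # same phase as path_a
path_b_out_of_phase = cmath.rect(0.5, cmath.pi)  # opposite phase to path_a

# Outcome probabilities come from the squared magnitude of the *summed* amplitudes.
constructive = abs(path_a + path_b_in_phase) ** 2      # peaks add up to 1.0
destructive = abs(path_a + path_b_out_of_phase) ** 2   # peaks and troughs cancel to 0.0

print(f"constructive interference -> probability {constructive:.2f}")
print(f"destructive interference  -> probability {destructive:.2f}")
```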

[Video: Physicist Katie Mack explains quantum probability.]

Achieving A Quantum Leap, Conceptually

The term “bit” is a metaphor. The word suggests that during calculations, a computer can break up large values into tiny ones – bits of information – which electronic devices such as transistors can more easily process.

Using metaphors like this has a cost, though. They are not perfect. Metaphors are incomplete comparisons that transfer knowledge from something people know well to something they are working to understand. The bit metaphor, for example, suggests that the binary method deals with many different kinds of bits at once, as common sense might imply. In reality, all bits are the same.

The smallest unit of a quantum computer is called the quantum bit, or qubit. But transferring the bit metaphor to quantum computing is even less adequate than applying it to classical computing. Carrying a metaphor from one use to another blunts its effect.

The prevalent explanation of quantum computing goes like this: While classical computers can store or process only a zero or a one in a transistor or other computational unit, quantum computers supposedly store and handle zero, one and every value in between at the same time through the process of superposition.

Superposition, however, does not store one or zero or any other number simultaneously. There is only an expectation that the values might be zero or one at the end of the computation. This quantum probability is the polar opposite of the binary method of storing information.

Driven by quantum science’s uncertainty principle, the probability that a qubit stores a one or a zero is like Schrödinger’s cat, which can be either dead or alive, depending on when you observe it. But the two values do not exist simultaneously during superposition. They exist only as probabilities, and an observer cannot determine when or how frequently those values existed before the observation ended the superposition.
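
One rough way to see this in code, using plain Python rather than any real quantum library, is to describe a single hypothetical qubit by two amplitudes and simulate many measurements: every individual readout is a definite 0 or 1, and only the statistics across many runs reveal the underlying probabilities.

```python
import random

# A hypothetical qubit in superposition, written as amplitudes for |0> and |1>.
# The squared magnitudes are the outcome probabilities and must sum to 1.
alpha, beta = 3 / 5, 4 / 5          # |alpha|^2 = 0.36, |beta|^2 = 0.64
p_zero = abs(alpha) ** 2

def measure() -> int:
    """One measurement: the superposition 'collapses' to a single definite bit."""
    return 0 if random.random() < p_zero else 1

shots = 10_000
ones = sum(measure() for _ in range(shots))
print(f"read 1 on {ones / shots:.1%} of runs (expected about 64%)")
print(f"read 0 on {1 - ones / shots:.1%} of runs (expected about 36%)")
```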

Moving past the limits of traditional binary computing metaphors means embracing new metaphors to explain quantum computing.

Peering Into Kaleidoscopes

The kaleidoscope metaphor is particularly apt for explaining quantum processes. Kaleidoscopes can create infinitely diverse yet orderly patterns using a limited number of colored glass beads, mirrored dividing walls and light. Rotating the kaleidoscope enhances the effect, generating an infinitely variable spectacle of fleeting colors and shapes.

The shapes not only change but also can’t be reversed. If you turn the kaleidoscope in the opposite direction, the imagery will generally remain similar, but the exact composition of each shape, or even its structure, will vary as the beads randomly mingle with one another. In other words, while the beads, light and mirrors could replicate some patterns shown before, the patterns are never exactly the same.

[Video: If you don’t have a kaleidoscope handy, this video is a good substitute.]


Using the kaleidoscope metaphor, the solution a quantum computer provides, the final pattern, depends on when you stop the computing process. Quantum computing isn’t about guessing the state of any given particle but about using mathematical models of how the interactions among many particles in various states create patterns, called quantum correlations.

Each final pattern is the answer to a problem posed to the quantum computer, and what you get in a quantum computing operation is a probability that a certain configuration will result.
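
To give the idea of quantum correlations a simple numerical face, the toy sketch below, again in plain Python and only a classical stand-in that reproduces the correlation in the readout statistics rather than the full quantum behavior, samples a perfectly correlated pair: each readout on its own looks like a coin flip, yet the two always agree, and such joint configurations are what a quantum computation ultimately reports with some probability.

```python
import random
from collections import Counter

def measure_correlated_pair() -> tuple[int, int]:
    """Toy stand-in for a maximally correlated pair: each bit alone is random,
    but only the configurations (0, 0) and (1, 1) ever occur."""
    shared = random.randint(0, 1)
    return shared, shared

shots = 10_000
counts = Counter(measure_correlated_pair() for _ in range(shots))
for configuration, count in sorted(counts.items()):
    print(f"configuration {configuration}: {count / shots:.1%} of runs")
```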

New Metaphors For New Worlds

Metaphors make the unknown manageable, approachable and discoverable. Approximating the meaning of a surprising object or phenomenon by extending an existing metaphor is a method as old as calling the edge of an ax its “bit” and its flat end its “butt.” These metaphors take something we understand very well from everyday life and apply it to a technology that needs a specialized explanation. Calling the cutting edge of an ax a “bit” suggestively indicates what it does, adding the nuance that it changes the object it is applied to. When an ax shapes or splits a piece of wood, it takes a “bite” from it.

Metaphors, however, do much more than provide convenient labels and explanations of new processes. The words people use to describe new concepts change over time, expanding and taking on a life of their own.

When encountering dramatically different ideas, technologies or scientific phenomena, it’s important to use fresh and striking terms as windows to open the mind and increase understanding. Scientists and engineers seeking to explain new concepts would do well to seek out originality and master metaphors – in other words, to think about words the way poets do.


Sorin Adam Matei is an Associate Dean for Research at Purdue University. This article is republished from The Conversation under a Creative Commons license. Read the original article.

Source: Discover Magazine