A digestible introduction to how quantum computers work and why they are essential in the evolution of AI and ML systems. Gain a simple understanding of the quantum principles that drive these machines.
Quantum computing is a rapidly accelerating field with the power to revolutionize artificial intelligence (AI) and machine learning (ML). As the demand for bigger, better, and more accurate AI and ML accelerates, standard computers will be pushed to the limits of their capabilities. Rooted in parallelization and capable of managing much more complex algorithms, quantum computers will be the key to unlocking the next generation of AI and ML models. This article aims to demystify how quantum computers work by breaking down some of the key principles that enable quantum computing.
A quantum computer is a machine that can perform many tasks in parallel, giving it incredible power to solve very complex problems very quickly. Although traditional computers will continue to serve the daily needs of the average person, the fast processing capabilities of quantum computers have the potential to revolutionize many industries far beyond what is possible with traditional computing tools. With the ability to run millions of simulations simultaneously, quantum computing could be applied to:
- Chemical and biological engineering: Complex simulation capabilities can allow scientists to discover and test new drugs and resources without the time, risk, and expense of laboratory experiments.
- Financial investment: Market fluctuations are incredibly difficult to predict as they are influenced by a large number of compounding factors. Nearly endless possibilities could be modeled by a quantum computer, allowing for greater complexity and better precision than a standard machine.
- Operations and manufacturing: A given process can have thousands of interdependent steps, making optimization problems in manufacturing cumbersome. With so many permutations of possibilities, a great deal of computation is needed to simulate manufacturing processes, and assumptions are often required to narrow the range of possibilities to fit within computational limits. The inherent parallelism of quantum computers would allow unrestricted simulations and unlock an unprecedented level of optimization in manufacturing.
Quantum computers are based on the concept of superposition. In quantum mechanics, superposition is the idea of existing in multiple states simultaneously. A condition of superposition is that it cannot be observed directly since the observation itself forces the system to adopt a singular state. While in superposition, there is a certain probability of observing any given state.
Intuitive understanding of superposition
In 1935, in a letter to Albert Einstein, the physicist Erwin Schrödinger shared a thought experiment that epitomized the idea of superposition. In this thought experiment, Schrödinger describes a cat sealed in a container with a radioactive atom that has a 50/50 chance of decaying and emitting a lethal amount of radiation. Schrödinger explained that until an observer opens the box and looks inside, there is an equal probability that the cat is alive or dead. Before the box is opened and an observation is made, one can think of the cat as being both alive and dead simultaneously. The act of opening the box and looking at the cat is what forces it to assume a single state: either alive or dead.
Experimental understanding of superposition
A more tangible demonstration of superposition came from an experiment performed by Thomas Young in 1801, although its implications were not understood until much later. In this experiment, a beam of light was aimed at a screen with two slits. The expectation was that a single band of light would appear behind each slit on a board placed behind the screen. Instead, Young observed several bands of intensified light separated by bands of darkness. This pattern allowed Young to conclude that the photons must be acting like waves as they pass through the slits in the screen. He came to this conclusion because he knew that when two intersecting waves peak at the same time, their amplitudes add and the resulting wave intensifies (producing the bright bands), whereas when two waves are in opposite phases, they cancel out (producing the dark bands).
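This wave reasoning can be captured in a few lines. The sketch below is a minimal numpy illustration in arbitrary units (not the actual 1801 setup): it only shows that two identical waves reinforce when they are in phase and cancel when they are out of phase.

```python
import numpy as np

# Two identical waves compared at different phase offsets.
# In phase (offset 0): crests align and amplitudes add (bright band).
# Out of phase (offset pi): crests meet troughs and cancel (dark band).
x = np.linspace(0, 2 * np.pi, 1000)

for phase, label in [(0.0, "in phase"), (np.pi, "out of phase")]:
    wave_a = np.sin(x)
    wave_b = np.sin(x + phase)
    combined = wave_a + wave_b
    print(f"{label}: peak intensity = {np.max(np.abs(combined)) ** 2:.1f}")

# in phase: peak intensity = 4.0   (twice the amplitude -> four times the intensity)
# out of phase: peak intensity = 0.0   (the waves cancel)
```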
While this conclusion of wave-particle duality persisted, the significance of the experiment grew as technology evolved. Scientists later found that even when photons are emitted one at a time, the wave pattern still appears on the backboard. This means that each single particle passes through both slits and acts like two intersecting waves. However, when the photon hits the board and is measured, it appears as a single photon. The act of measuring the photon's location forces it to collapse into a single state instead of existing in the multiple states it occupied while traversing the screen. This experiment illustrates superposition.
Application of superposition to quantum computers
Standard computers work by manipulating binary digits (bits), which are stored in one of two states, 0 or 1. In contrast, a quantum computer is built on quantum bits (qubits). Qubits can exist in superposition, so instead of being limited to 0 or 1, a qubit can be both 0 and 1 at once, in any weighted combination of the two states. This superposition of states is what allows quantum computers to process many calculations in parallel.
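As a minimal sketch of what "a weighted combination of 0 and 1" means, a single qubit can be written as a vector of two amplitudes; the equal superposition below is just an illustrative example, not a prescription for how real hardware is programmed.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is described by two complex
# amplitudes (a, b) for the states |0> and |1>, with |a|^2 + |b|^2 = 1.
ket_zero = np.array([1, 0], dtype=complex)          # definitely 0
ket_one = np.array([0, 1], dtype=complex)           # definitely 1
superposition = (ket_zero + ket_one) / np.sqrt(2)   # equal mix of 0 and 1

# Measuring the qubit collapses it: |amplitude|^2 gives the probability
# of observing each outcome.
probabilities = np.abs(superposition) ** 2
print(probabilities)   # [0.5 0.5] -> 50% chance of 0, 50% chance of 1
```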
Qubits are usually built with subatomic particles like photons and electrons, which the double-slit experiment confirmed can exist in superposition. Scientists put these subatomic particles into superposition using lasers or microwave beams.
John Davidson explains the advantage of using qubits instead of bits with a simple example. Because everything on a standard computer is made up of 0s and 1s, when you run a simulation on a standard machine, the machine iterates through different sequences of 0s and 1s (i.e., it compares 00000001 to 10000001, and so on). Since a qubit exists as both 0 and 1, there is no need to try the combinations one at a time. Instead, a single simulation covers all possible combinations of 0s and 1s simultaneously. This inherent parallelism allows quantum computers to process millions of calculations at the same time.
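A small numpy sketch of that idea: putting n qubits into an equal superposition yields a single state vector that carries an amplitude for every n-bit string at once. The three-qubit size and the Hadamard-gate construction here are illustrative choices, not a description of any particular machine.

```python
import numpy as np

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Start three qubits in |000> and apply a Hadamard to each one.
n = 3
state = np.array([1, 0])
full_h = H
for _ in range(n - 1):
    state = np.kron(state, np.array([1, 0]))
    full_h = np.kron(full_h, H)
state = full_h @ state

# The single state vector now holds one amplitude per 3-bit string:
# every combination from 000 to 111 is represented at the same time.
for index, amplitude in enumerate(state):
    print(f"{index:03b}: probability {abs(amplitude) ** 2:.3f}")   # each 0.125
```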
In quantum mechanics, entanglement describes the tendency of quantum particles to interact and become correlated in such a way that they can no longer be described in isolation, because the state of one particle depends on the state of the other. When two particles become entangled, their states are dependent regardless of how far apart they are. If the state of one qubit changes, the state of its partner changes instantly. Einstein famously described this distance-independent association as “spooky action at a distance.”
Because observing a quantum particle forces it into a single state, scientists have seen that if one particle in an entangled pair is measured with an upward spin, its partner will be found with the opposite downward spin. While it is not yet fully understood how or why this happens, the implications have been powerful for quantum computing.
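The correlation can be sketched with a two-qubit state written out by hand. This is my own illustrative example of an anti-correlated entangled pair (mirroring the opposite-spin picture above), simulated by sampling from the measurement probabilities.

```python
import numpy as np

rng = np.random.default_rng(0)

# An entangled pair analogous to the opposite-spin example: the two qubits
# are always observed in opposite states, whichever outcome occurs.
# State (|01> + |10>) / sqrt(2), over the basis |00>, |01>, |10>, |11>.
entangled = np.array([0, 1, 1, 0]) / np.sqrt(2)
probabilities = np.abs(entangled) ** 2

# Simulate measuring the pair many times.
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probabilities)
for outcome in ["00", "01", "10", "11"]:
    print(outcome, np.count_nonzero(outcomes == outcome))
# Only "01" and "10" ever appear: knowing one qubit's result fixes the other.
```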
In quantum computing, scientists take advantage of this phenomenon. Specially designed algorithms work on entangled qubits to dramatically speed up computations. In a standard computer, adding a bit adds processing power linearly: double the bits and you double the processing power. In a quantum computer, adding qubits increases processing power exponentially, so each additional qubit dramatically increases computational power.
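A quick arithmetic sketch of that scaling claim: an n-qubit state is described by 2^n amplitudes, so each added qubit doubles the amount of information the state carries. The qubit counts below are arbitrary examples.

```python
# Each added qubit doubles the number of amplitudes in the state vector,
# while each added classical bit only adds one more binary digit.
for n in (1, 2, 10, 20, 50):
    print(f"{n:2d} qubits -> state vector of {2 ** n:,} amplitudes")

#  1 qubits -> state vector of 2 amplitudes
#  2 qubits -> state vector of 4 amplitudes
# 10 qubits -> state vector of 1,024 amplitudes
# 20 qubits -> state vector of 1,048,576 amplitudes
# 50 qubits -> state vector of 1,125,899,906,842,624 amplitudes
```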
While entanglement brings a great advantage to quantum computing, its practical application presents a great challenge. As discussed, observing a quantum particle forces it to adopt a specific state instead of continuing to exist in superposition. In a quantum system, any external disturbance (a temperature change, a vibration, stray light, etc.) can act as an “observation” that forces a quantum particle into a specific state. As the particles become increasingly entangled and state-dependent, they become especially vulnerable to external disturbances, because a disturbance needs to affect only one qubit to have a cascading effect on many more entangled qubits. When a qubit is forced into a 0 or 1 state, the information contained in its superposition is lost, causing an error before the algorithm can complete. This challenge, called decoherence, has so far prevented quantum computers from being put to practical use. Decoherence is measured as an error rate.
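A back-of-the-envelope sketch of why even a small error rate is crippling: if each operation succeeds independently with probability 1 − p, a computation of k operations finishes without an error only about (1 − p)^k of the time. The per-gate error rate below is an illustrative number, not a measured figure.

```python
# Rough illustration of decoherence as an error rate: with an independent
# error probability p per gate, the chance that an algorithm of k gates
# finishes without a single error is (1 - p) ** k.
error_rate = 0.001   # illustrative per-gate error rate
for gates in (100, 1_000, 10_000, 100_000):
    success = (1 - error_rate) ** gates
    print(f"{gates:>7,} gates -> {success:.1%} chance of an error-free run")

#     100 gates -> 90.5% chance of an error-free run
#   1,000 gates -> 36.8% chance of an error-free run
#  10,000 gates -> 0.0% chance of an error-free run
# 100,000 gates -> 0.0% chance of an error-free run
```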
Certain physical error-reduction techniques have been used to minimize disturbances from the outside world, including keeping quantum computers at near-absolute-zero temperatures and in vacuum environments, but so far they have not made a significant enough difference in quantum error rates. Scientists have also been exploring error-correcting codes that fix errors without destroying the underlying information. While Google recently implemented an error-correcting code that resulted in historically low error rates, the information loss is still too high for quantum computers to be used in practice. Error reduction is currently the main focus for physicists, as it is the most significant barrier to practical quantum computing.
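To give a feel for what "correcting errors without losing the information" means, here is the classical flavour of the idea (a simple repetition code with a majority vote), which is my own illustrative sketch and not the quantum code Google used.

```python
import numpy as np

rng = np.random.default_rng(1)

# Classical repetition code: store each logical bit as three copies and
# recover it by majority vote, so a single flipped copy is corrected.
def encode(bit):
    return [bit, bit, bit]

def noisy(copies, flip_probability=0.05):
    # Each copy may independently flip, mimicking an external disturbance.
    return [bit ^ int(rng.random() < flip_probability) for bit in copies]

def decode(copies):
    return int(sum(copies) >= 2)   # majority vote

errors = sum(decode(noisy(encode(1))) != 1 for _ in range(100_000))
print(f"logical error rate with correction: {errors / 100_000:.4f}")
# Roughly 0.007 instead of the raw 0.05: redundancy plus a vote fixes most errors.
```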
Although more work is needed to bring quantum computers to life, it is clear that great opportunities exist to harness quantum computing to implement highly complex AI and ML models to improve a variety of industries.
Happy learning!
Sources
Superposition: https://scienceexchange.caltech.edu/topics/quantum-science-explained/quantum-superposition
Entanglement: https://quantum-computing.ibm.com/composer/docs/iqx/guide/entanglement
Quantum computers: https://builtin.com/hardware/quantum-computing