Conventional computers have come a long way since their inception. Today, people carry more computational power in their palms than a room-sized machine could muster half a century ago. Yet even with modern supercomputers processing data at petaflop speeds (10^15 floating-point operations per second), certain computational problems remain out of reach. And with transistors approaching the smallest physically possible dimensions, a fundamentally different computing paradigm is needed. Hence, quantum computers.
Quantum physics primarily deals with the bizarre and abstruse phenomena that occur at atomic and subatomic scales. But the actual meaning and implications of the science are contentious topics even among physicists. Unsurprisingly, the term is hardly transparent to the uninitiated. Indeed, most physicists stand divided when it comes to answering fundamental questions about quantum theory; perhaps that is why physicists often have a love-hate relationship with the topic. The pith of this sweet-and-sour outlook was succinctly captured by the renowned theoretical physicist Albert Einstein, who stated that “If it [quantum theory] is correct, it signifies the end of physics as a science.”
Although the majority dubs quantum physics a recondite subject, it is imperative to dive into this labyrinth to comprehend the science. To accomplish this feat, one needs to make peace with three counter-intuitive concepts: superposition, interference, and entanglement.
So, how do these devices, which run on a highly debated branch of physics, actually function? Modern devices such as tablets, smartwatches, and smart glasses are commanded by digital bits, 0s and 1s, whereas quantum computers rely on qubits. A qubit is the basic unit of quantum information. Where 0 and 1 represent the electrical signals that switch a transistor in a classical computer, the |0⟩ and |1⟩ of a quantum computer denote an inherent state of a subatomic particle such as an electron or a photon: the spin of an electron (up or down), say, or the polarization of a photon (vertical or horizontal). Crucially, a phenomenon known as superposition allows a qubit to exist in a dual state; it can be, for instance, 30% in the 0-state and 70% in the 1-state. As preposterous as that sounds, the bigger challenge, apart from making peace with this reality, was to find a way of quantifying these probabilities.
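The 30%/70% split above can be made concrete with a small sketch (in Python with NumPy, which is my own illustrative choice, not something the article prescribes): a qubit is a pair of complex amplitudes, and the percentages are the squared magnitudes of those amplitudes.

```python
import numpy as np

# A qubit is a normalized 2-vector of complex amplitudes over |0> and |1>.
# The probability of each outcome is the squared magnitude of its amplitude,
# so a 30%/70% split corresponds to amplitudes sqrt(0.3) and sqrt(0.7).
qubit = np.array([np.sqrt(0.3), np.sqrt(0.7)], dtype=complex)

p0 = abs(qubit[0]) ** 2   # probability of measuring |0>
p1 = abs(qubit[1]) ** 2   # probability of measuring |1>

print(p0, p1)             # 0.3 and 0.7, up to floating-point rounding
assert np.isclose(p0 + p1, 1.0)  # a valid state is always normalized
```

The normalization check at the end is the whole constraint: a qubit can hold any split of the two states, as long as the probabilities sum to one.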
The concept of linear superposition and two-state devices sounds apocryphal and fascinating at once. But how do qubits achieve this seemingly unlikely feat, one virtually absent from our daily lives? We can get a grip on the topic by scrutinizing noise-cancelling headphones. These devices implement destructive interference to cancel out unwanted ambient noise. First, microphones capture the “noise” from the surroundings. Then, the speakers produce a sound wave with the same frequency and amplitude as the ambient noise but 180° out of phase, so that crests line up with troughs. This “cancelling out” of waves is called destructive interference.
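The headphone trick is easy to verify numerically. The following minimal sketch (a generic wave simulation, not tied to any real headphone firmware) adds a wave to its 180°-shifted copy and shows the sum vanish:

```python
import numpy as np

t = np.linspace(0, 1, 1000)                  # one second of samples
noise = np.sin(2 * np.pi * 440 * t)          # a 440 Hz "ambient noise" wave
anti = np.sin(2 * np.pi * 440 * t + np.pi)   # same wave, shifted 180 degrees

residual = noise + anti                      # crests meet troughs
print(np.max(np.abs(residual)))              # ~0: the waves cancel completely
```

Shift the second wave by anything other than a half-cycle and the residual no longer vanishes; full cancellation needs both matched amplitude and exact opposition of phase.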
This idea of interference is vital in determining the probability of a qubit holding a given state. The amplitude of the resultant wave after superposition is a direct indicator of the probability of finding the particle in that state. Now, how do we measure all these states at once? More importantly, how do we measure the states of a more complex, 64-qubit system, which would mean tracking 2^64 (18,446,744,073,709,551,616) possible states at once? In short, we cannot. The challenge with superposition is that the moment we initiate a measurement, the particle, previously in trillions of probable states, collapses to just one of them. Thus, we can only ever measure one state at a time.
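Collapse can be imitated classically by sampling. In this sketch (my own toy model, using the 30%/70% qubit from earlier), each "measurement" yields a single definite outcome, and the underlying probabilities only emerge after many repetitions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Amplitudes for the 30%/70% qubit discussed above.
amplitudes = np.array([np.sqrt(0.3), np.sqrt(0.7)])
probs = np.abs(amplitudes) ** 2

# Each measurement collapses the superposition to a single outcome (0 or 1);
# only by repeating the experiment many times do the probabilities emerge.
outcomes = rng.choice([0, 1], size=10_000, p=probs)
print(outcomes.mean())   # fraction of 1s, close to 0.7
```

A single draw from `outcomes` tells you almost nothing about the amplitudes, which is exactly the predicament a quantum programmer faces: the rich superposition is inaccessible except through one collapsed reading at a time.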
This is where quantum physics becomes truly abstract, if it wasn’t so already. The concept is so strange that even Albert Einstein was lost for words to describe it and derided it as “spooky action at a distance”. Quantum entanglement surfaces when particles are engendered in such a way that the state of one particle is contingent on the other. The most astonishing property of entanglement is that these contingencies persist across arbitrary distances through space; some physicists have even conjectured a deep link between entanglement and wormholes. Much to the dismay of Einstein, whose theory of relativity proscribes faster-than-light travel, these correlations appear to take hold instantaneously. The saving grace, as we shall see, is that they cannot be used to send information, so Einstein’s limit survives intact.
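The canonical example of such contingent states is the Bell state, an equal superposition of "both qubits 0" and "both qubits 1". The toy simulation below (again my own NumPy sketch, not a real quantum device) shows that each individual outcome is random, yet the two qubits always agree:

```python
import numpy as np

# The Bell state (|00> + |11>)/sqrt(2): neither qubit has a definite state
# on its own, yet the two measurement outcomes always agree.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # basis order: |00>, |01>, |10>, |11>
probs = np.abs(bell) ** 2                    # [0.5, 0, 0, 0.5]

rng = np.random.default_rng(1)
samples = rng.choice(4, size=1000, p=probs)  # joint measurements of both qubits
pairs = [divmod(s, 2) for s in samples]      # split index into (qubit A, qubit B)

# Every pair is (0, 0) or (1, 1): individually random, perfectly correlated.
print(all(a == b for a, b in pairs))         # True
```

Note why no message can be sent this way: each party alone sees only a fair coin flip; the correlation is visible only after the two records are brought together and compared.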
The phenomenon of quantum entanglement, identified in 1935, solicited a plethora of theories to explain this iconoclastic interaction between particles. One such hypothesis, the “hidden variables” theory, posited that the particles, upon creation, surreptitiously agree on what the state of one will be with respect to the other upon measurement. Although this theory eliminated the need for information to travel faster than the speed of light, it could not endure Bell’s inequality theorem. In short, Bell showed statistically that measuring the spins of entangled particles along different directions yields correlations stronger than any prior “hidden agreement” could produce. Experiments confirmed the quantum prediction, debunking the notion of a secret pact among particles while also showing that no usable information is transmitted instantaneously.
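The quantitative version of Bell's argument is the CHSH combination of four spin correlations: any hidden-variables model is bounded by 2, while quantum mechanics predicts 2√2 for a Bell state. This sketch (my own illustrative computation; the measurement angles are the standard ones for maximal violation) checks the quantum prediction directly from the state vector:

```python
import numpy as np

# Pauli matrices; a spin measurement along angle theta in the x-z plane
# corresponds to the operator cos(theta)*Z + sin(theta)*X.
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def spin(theta):
    return np.cos(theta) * Z + np.sin(theta) * X

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # the Bell state (|00> + |11>)/sqrt(2)

def correlation(a, b):
    # Expectation value <psi| A(a) (x) B(b) |psi> for the Bell state,
    # i.e. the average product of the two spin measurements.
    op = np.kron(spin(a), spin(b))
    return bell @ op @ bell

# CHSH combination with the angles that maximize the quantum violation.
a1, a2, b1, b2 = 0, np.pi / 2, np.pi / 4, -np.pi / 4
S = (correlation(a1, b1) + correlation(a1, b2)
     + correlation(a2, b1) - correlation(a2, b2))
print(S)   # ~2.828 = 2*sqrt(2), exceeding the classical bound of 2
```

The printed value exceeding 2 is the mathematical content of "could not endure Bell's inequality": no assignment of pre-agreed answers reproduces these correlations.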
Although the internet is rife with speculative theories for engineering “deterministic” entanglement, in reality these are unsubstantiated. So, if we can only measure one out of trillions of states at a time and cannot fully tame entanglement, does this mean all the fuss regarding quantum computing is futile? No, because clever manipulation provides an escape from this dicey situation. The trick is to leverage interference in such a way that all the “incorrect” solutions cancel out, leaving only the correct ones. And as far as entanglement goes, scientists aim to harness it to create bespoke encryption keys (discussed in Part II).
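The best-known instance of "cancelling the incorrect answers" is Grover's search algorithm. The sketch below simulates one Grover iteration over four items in plain NumPy (a simulation of the amplitude arithmetic, not code for actual quantum hardware; the marked index is an arbitrary choice): the wrong answers interfere destructively while the right one is amplified.

```python
import numpy as np

n = 4                      # search space of 4 items (a 2-qubit register)
marked = 2                 # index of the "correct" answer (arbitrary choice)

# Start in an equal superposition over all items.
state = np.full(n, 1 / np.sqrt(n))

# One Grover iteration: the oracle flips the sign of the marked amplitude,
# then the diffusion step reflects every amplitude about the mean.
state[marked] *= -1                # oracle: [0.5, 0.5, -0.5, 0.5]
state = 2 * state.mean() - state   # diffusion: [0, 0, 1, 0]

print(np.abs(state) ** 2)          # all probability lands on the marked index
```

For four items a single iteration concentrates the entire probability on the marked entry; the three wrong answers end with amplitude exactly zero, which is destructive interference doing the work the paragraph describes.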
Now that we have an idea of the mechanisms underpinning a quantum computer, let’s take a look at some of its key functional aspects. First we have the chips, which are commendable feats of science in their own right. A quantum chip, which hosts the qubits, is a superconducting chip structured with silicon beneath a metal layer. The structural layout of the chip allows it to exhibit superconducting behavior at extremely low temperatures of 10 to 15 millikelvin (mK). These dead-of-winter temperatures are achieved by employing a labyrinth of refrigeration systems comprising dilution refrigerators and cryocoolers. It is imperative for the chips to be extremely cold: reducing the energy possessed by the particles allows for precise measurements. Moreover, low-energy particles are less predisposed to unwanted movements, precluding any unintentional flipping of quantum states.
Now that we have tortured our brains with the esoteric concepts upon which a quantum computer operates, let’s save the rest of the harassment for Part II of this series. The next article will look deeper into the status quo of the field; in particular, it will shed light on the feasibility of the whole edifice of using quantum physics for computational devices. It will also address the possible applications of quantum computers and the deterring factors that could curb their development. Part II will not only delineate how these counter-intuitive and abstruse concepts dovetail into a functional machine but also single out the dire need for these avant-garde machines. Until then, try cooling your PC to absolute zero and see if it starts behaving like its quantum counterpart. Spoiler alert: it won’t.