What Is Quantum Computing?
If you're already familiar with quantum computing, feel free to skip to the next section. If not: quantum computing is aptly named for its quantum properties. In quantum physics, a particle does not have a definite state until it is observed. Classical computing, by contrast, reads digital data as bits (1s and 0s, or ON and OFF states), the system we know as binary, and manipulates those bits into different arrangements using various logic gates.
Quantum computing combines concepts from classical computing and quantum mechanics to create qubits (short for "quantum bits"). Unlike a classical bit, a qubit can be in a superposition of 1 and 0 at the same time, much like Schrödinger's cat, which is neither alive nor dead until observed. Four bits have 16 possible combinations (2⁴) but can only hold one of them at a time, whereas four qubits can exist in a superposition of all 16 combinations at once until they are measured. This lets a quantum computer explore many possibilities simultaneously, though the trick is designing algorithms that make the right answer likely when you finally measure. Grover's search algorithm, for example, reduces the time required to search a large collection to roughly the square root of the number of entries.
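To make those numbers concrete, here's a minimal Python sketch (my own illustration, not code from any quantum library) that represents a 4-qubit register as 16 amplitudes and compares classical and Grover-style query counts:

```python
import math

# A 4-qubit register is described by 2**4 = 16 amplitudes,
# one per classical bit pattern 0000 through 1111.
n_qubits = 4
n_states = 2 ** n_qubits  # 16

# A uniform superposition gives every basis state equal amplitude.
amplitude = 1 / math.sqrt(n_states)
state = [amplitude] * n_states

# Squared amplitudes are probabilities, and they must sum to 1.
total_probability = sum(a * a for a in state)

# Grover's algorithm finds a marked entry among N in ~sqrt(N) queries,
# versus ~N/2 on average for a classical linear scan.
N = 1_000_000
classical_queries = N // 2            # 500,000
grover_queries = round(math.sqrt(N))  # 1,000

print(n_states, total_probability, classical_queries, grover_queries)
```

Searching a million entries drops from hundreds of thousands of classical lookups to about a thousand Grover iterations, which is where the "square root" speedup comes from.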
Quantum computers are not practical for most tasks handled by personal computers, but they excel at large-scale calculations such as searching databases, running simulations, and even breaking encryption. The video below is the simplest explanation of quantum computing I have seen so far.
The 51 Qubit Quantum Computer and Cold Atom Physics
It seems like every few months, quantum computing reaches a new milestone. Last month, at the International Conference on Quantum Technologies in Moscow, attendees and reporters gathered en masse for Professor John Martinis' presentation of a chip embedded with 49 qubits. Instead, in a fashion that reminds me of Steve Harvey announcing the Miss Universe pageant, Mikhail Lukin, a Harvard professor and co-founder of the Russian Quantum Center, made his own announcement and stole the show.
Lukin's team had successfully created the most powerful functional quantum computer to date, which runs on 51 qubits. The device was tested successfully at Harvard, where it solved physics problems that silicon chip-based supercomputers were struggling with.
Most quantum computers have been designed using superconductors and even semiconductors. Martinis' 49-qubit chip was constructed in this fashion. Since traditional semiconductor materials are reaching their limits, Lukin's team took a different approach.
The 51-qubit machine uses "cold atoms" as its qubits, locked in place by lasers. Cold atom physics is the discipline of studying atoms at incredibly low temperatures (around 0.0000001 kelvin, or 100 nanokelvin) in order to observe quantum behavior. Cooling atoms to temperatures near absolute zero slows their movement down and makes them easier to observe. The video below gives an introduction to cold atom physics (starting at 1:35). After that, we'll get into the biggest question I had about all of this:
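To see why cooling matters, compare thermal speeds. The root-mean-square speed of an atom in a gas scales as √(3k·T/m), so colder atoms move far slower. The sketch below uses rubidium-87 as the example atom (a common choice in cold-atom labs; the specific mass value is my assumption, not a figure from the paper):

```python
import math

# RMS thermal speed of an atom: v = sqrt(3 * k_B * T / m)
k_B = 1.380649e-23             # Boltzmann constant, J/K
m_rb87 = 86.909 * 1.66054e-27  # mass of a rubidium-87 atom, kg

def rms_speed(temperature_k):
    """RMS thermal speed in m/s at the given temperature in kelvin."""
    return math.sqrt(3 * k_B * temperature_k / m_rb87)

room = rms_speed(300)   # room temperature: hundreds of m/s
cold = rms_speed(1e-7)  # 100 nanokelvin: millimetres per second
print(f"300 K:  {room:.0f} m/s")
print(f"1e-7 K: {cold * 1000:.1f} mm/s")
```

Going from room temperature to 100 nanokelvin slows the atoms from hundreds of metres per second to a few millimetres per second, slow enough to trap and photograph them individually.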
How the heck do super-cooled atoms with lasers shining through them make a computer?
How Do Cold Atoms Make a Functional Computer?
Lukin's team wrote a research paper (PDF) explaining the experiment they set up. After sifting through the equations, I arrived at the data-reading mechanism. The setup consists of a linear array of 101 evenly spaced "optical tweezers," which are generated by feeding a multi-tone RF signal into an acousto-optic deflector.
In simpler terms, they shine a laser beam through a vacuum tube and take fluorescence images (a type of laser scanning microscopy) of the atoms as they change between positions. The "traps" that control the position of the atoms are programmable, which allows this super-cooled vacuum tube with a laser shooting through it to function like a quantum computer.
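Here's a rough sketch of how the tweezer array comes about: each RF tone fed into the deflector maps (approximately linearly) to a deflection angle, so evenly spaced tones yield evenly spaced traps. All the numbers below are placeholders I chose for illustration, not values from the paper:

```python
# Each RF tone at frequency f deflects the laser by an angle ~ c * f,
# so 101 evenly spaced tones produce 101 evenly spaced traps.
n_traps = 101
f_start, f_stop = 75e6, 125e6  # RF frequency range in Hz (assumed)
step = (f_stop - f_start) / (n_traps - 1)
tones = [f_start + i * step for i in range(n_traps)]

deflection_per_hz = 1e-9  # rad per Hz (assumed linear AOD response)
focal_length = 0.01       # metres (assumed focusing optics)

# Position of each trap in the focal plane: x = focal_length * angle
positions = [focal_length * deflection_per_hz * f for f in tones]

# Consecutive traps end up separated by a constant spacing.
spacings = [b - a for a, b in zip(positions, positions[1:])]
print(len(positions))  # 101
print(f"{spacings[0] * 1e6:.1f} micrometres between traps")
```

Changing which tones are present in the RF signal turns individual traps on and off, which is what makes the array programmable.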
As computing devices become ever smaller, engineers have been teaming up with scientists from other disciplines like physics and biology to make some outside-the-box computing devices. Although it's unlikely that any of these will end up in personal devices anytime soon (or ever), it always reminds me that a computer is just a device that calculates problems, and what our concept of a "computer" will look like in 100 years might just be beyond our current levels of comprehension.
If you'd like to learn more about quantum computing, I've compiled some resources below along with some of my favorite outlandish non-silicon computers!
Quantum Computing Resources
Some "Out There" Computers and Logic-Based Devices
Featured image used courtesy of Kurzgesagt