This article covers the very basics of quantum computing and how quantum computers are designed. It also introduces the principles of quantum theory that make quantum computing work. The objective is to help you understand this new-age technology and its benefits.

Recommended Level

Intermediate

 

What is Quantum Computing? 

In 1981, at Argonne National Laboratory, Paul Benioff used Max Planck's idea that energy, like matter, exists in individual units to theorize the concept of quantum computing. Since then, the idea of manufacturing quantum computers for everyday use has become more tangible with each new advance in quantum technology. Quantum computing rests on the principles of quantum theory, the branch of modern physics that explains the behavior of matter and energy at the atomic and subatomic level. It makes use of quantum phenomena such as superposition and entanglement, along with quantum bits, to perform data operations. Computing in this manner can tackle extremely difficult tasks that ordinary computers cannot perform on their own. 

In classical computing, a bit is the basic unit of information a computer works with. Quantum computing instead uses quantum bits, or qubits, as its unit of memory. A qubit is a two-state quantum-mechanical system: one that can exist in either of two distinguishable quantum states, or in a combination of both. Seeing the terms superposition and entanglement might be baffling, and that's okay: we don't experience these phenomena in everyday life. Superposition is the principle that, until we measure an object to check its state, it can effectively be in all of its possible states simultaneously. Entanglement is the way particles become correlated so that they interact with each other regardless of the distance between them. 
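As a rough illustration (not tied to any real quantum hardware or library), a qubit's state can be modeled as a pair of complex amplitudes for the states |0⟩ and |1⟩; the squared magnitude of each amplitude gives the probability of measuring that outcome:

```python
import math

# A single qubit as two complex amplitudes (alpha, beta) for |0> and |1>.
# The probabilities |alpha|^2 and |beta|^2 must sum to 1. The values below
# describe an equal superposition and are purely illustrative.
alpha = complex(1 / math.sqrt(2), 0)  # amplitude of |0>
beta = complex(1 / math.sqrt(2), 0)   # amplitude of |1>

p0 = abs(alpha) ** 2  # probability of measuring 0
p1 = abs(beta) ** 2   # probability of measuring 1

print(round(p0, 3), round(p1, 3))  # 0.5 0.5 for an equal superposition
print(round(p0 + p1, 3))           # normalization check: 1.0
```

Measuring this qubit yields 0 or 1 with equal probability, even though before measurement it holds both amplitudes at once.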

 

Why do these phenomena matter?

Entanglement and superposition are extremely important for advancing computing and communications in ways that can benefit us enormously. These two phenomena can be used to process an extremely large number of calculations that ordinary computers cannot. The power of quantum computing is astonishing, and few people grasp the full capabilities it has to offer. While classical computers process information as 1's and 0's, quantum computers operate according to the laws of quantum physics: information can be processed as a 1, a 0, or both 1 and 0 simultaneously. This is possible because of the quantum-mechanical principle of superposition. 

In the classical world, where each bit is either a "0" or a "1," a pair of bits can occupy only one of four possible states (00, 01, 10, or 11) at any point in time. In a quantum superposition state, however, all four of those states can co-exist simultaneously. 
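Extending the same picture to two qubits: an equal superposition assigns one amplitude to each of the four basis states, so every outcome is equally likely. A small illustrative sketch:

```python
import itertools
import math

# Two classical bits occupy exactly one of four states: 00, 01, 10, 11.
# Two qubits carry one complex amplitude per basis state; in an equal
# superposition each amplitude is 1/2, so each outcome has probability 1/4.
basis = ["".join(bits) for bits in itertools.product("01", repeat=2)]
amplitude = 1 / math.sqrt(len(basis))          # 1/2 for each of the 4 states
probs = {state: amplitude ** 2 for state in basis}

print(basis)        # ['00', '01', '10', '11']
print(probs["00"])  # 0.25
```

Note how the state space doubles with each added qubit; this exponential growth is the source of quantum computing's potential power.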

 


 

Not only are these principles extremely interesting, they are essential to the computing power needed for calculating difficult problems. One example is integer factorization using Shor's algorithm, which is composed of two parts: the first part transforms the problem from finding the factors of a composite number into finding the period of a function, and the second part finds that period using the quantum Fourier transform. The procedure can be laid out in five steps that deduce a nontrivial prime factor of a given integer n. 

1. A random positive integer m < n is chosen and gcd(m, n) is calculated in polynomial time using the Euclidean algorithm. If gcd(m, n) ≠ 1, then a nontrivial factor of n has been found and the problem is done. If gcd(m, n) = 1, then proceed to step 2.
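Step 1 needs nothing quantum; the Euclidean algorithm takes only a few lines. A minimal sketch (`euclid_gcd` is an illustrative name), using the article's later example of n = 91, m = 3:

```python
def euclid_gcd(a, b):
    """Euclidean algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

n, m = 91, 3
g = euclid_gcd(m, n)
print(g)  # 1, so m shares no factor with n and we continue to step 2
```

Had the random m happened to share a factor with n (say m = 14), the gcd itself would already be a nontrivial factor and the algorithm would stop here.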

 

2. A quantum computer is used to deduce the unknown period P of the sequence 

$$m\text{ mod n, } m^{2}\text{ mod n, } m^{3}\text{ mod n, } m^{4}\text{ mod n, ...}$$
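The quantum Fourier transform is what finds this period efficiently; classically it can only be brute-forced, which is exponentially slow for large n. A classical stand-in for illustration (`find_period` is a name chosen here, not part of any library):

```python
def find_period(m, n):
    """Brute-force the least P with m**P congruent to 1 (mod n).
    This classical search stands in for the quantum Fourier transform,
    which finds P efficiently on a quantum computer."""
    value, P = m % n, 1
    while value != 1:
        value = (value * m) % n
        P += 1
    return P

print(find_period(3, 91))  # 6, matching the worked example for n = 91
```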

 

3. If P is found to be an odd integer, step 1 is repeated. If P is even, then proceed to step 4. 

 

4. Since the period P is even, 

$$(m^{P/2}-1)(m^{P/2}+1)=m^{P}-1=\text{0 mod n}$$

If $m^{P/2} + 1 = 0 \text{ mod } n$, then step 1 is repeated. If $m^{P/2} + 1 \neq 0 \text{ mod } n$, then proceed to step 5. 

 

5. Finally, $d = \gcd(m^{P/2} - 1, n)$ is computed using the Euclidean algorithm. Since $m^{P/2} + 1 \neq 0 \text{ mod } n$ was established in step 4, it can be shown that d is a nontrivial factor of n. 
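The five steps above can be sketched end-to-end in classical Python, with brute-force period finding standing in for the quantum step (the function name and retry limit are choices made here for illustration, and the whole routine loses the quantum speedup):

```python
import math
import random

def shor_classical_sketch(n, max_tries=50):
    """Classical sketch of the five steps; illustrative only.
    The quantum advantage lives entirely in the period-finding step,
    which is replaced here by an exponential-time brute-force search."""
    for _ in range(max_tries):
        m = random.randrange(2, n)          # step 1: random m < n
        g = math.gcd(m, n)
        if g != 1:
            return g                        # lucky draw: a factor already
        value, P = m % n, 1                 # step 2: find the period P
        while value != 1:
            value = (value * m) % n
            P += 1
        if P % 2 == 1:                      # step 3: odd period, retry
            continue
        if pow(m, P // 2, n) == n - 1:      # step 4: m^(P/2) = -1 mod n, retry
            continue
        # step 5: gcd(m^(P/2) - 1, n) is a nontrivial factor
        return math.gcd(pow(m, P // 2, n) - 1, n)
    return None

print(shor_classical_sketch(91))  # 7 or 13, depending on the random m
```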

 

Below is an example of how n = 91 (= 7 × 13) can be factorized using Shor's algorithm. 

 

1. A random positive integer m = 3 is chosen, since gcd(3, 91) = 1. 

 

2. We need the period P of the function

$$f(a) = 3^{a}\text{ mod }91$$

Shor's period-finding routine, run on a quantum computer, gives P = 6.

 

3. Since the period is even, we proceed to step 4.

 

4. Since

$$3^{P/2} + 1 = 3^{3} + 1 = 28 \neq 0 \text{ mod } 91$$ 

we proceed to step 5. 

 

5. The Euclidean algorithm then yields the factor:

$$d = gcd(3^{P/2} - 1, 91) = gcd(3^{3} - 1, 91) = gcd(26, 91) = 13$$

 

Through careful calculation and the use of a quantum computer, the nontrivial prime factor d = 13 of n = 91 was found. 
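As a sanity check, the worked example's arithmetic can be verified with a few lines of Python:

```python
import math

# Re-check the worked example: n = 91, m = 3, period P = 6.
n, m, P = 91, 3, 6

assert pow(m, P, n) == 1              # 3^6 = 729 = 1 mod 91, so P is a period
check = pow(m, P // 2) + 1            # 3^3 + 1 = 28
print(check % n)                      # 28, not 0, so step 5 applies
d = math.gcd(pow(m, P // 2) - 1, n)   # gcd(26, 91)
print(d)                              # 13, a nontrivial prime factor of 91
```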

 

Practicality of Quantum Computers

A company called D-Wave Systems is selling the largest quantum computer ever built. Companies like Google and NASA have paid just over $10 million for the machine. While these companies have the funds to purchase such a device, its applications are not as endless as one might think. Google is using the D-Wave Two machine to help build a similar quantum computer that solves only optimization problems, with applications such as artificial intelligence and faster web search, while NASA is using the system to advance mission planning, pattern recognition, and air traffic control. 

As of now, Google and NASA have mainly used the D-Wave Two to inform their own quantum computing hardware. Published research has questioned whether these machines are actually performing any quantum-mechanical computation: in comparisons against classical computers, the quantum processor was sometimes found to be 10 times faster, but more often than not it was 100 times slower than a classical computer. 

From these assessments, quantum computing machines are still far from being universally regarded as true quantum computers. The physics and engineering behind them have come a long way since the field's inception, but they still have a long way to go. 

 


 

This technology may one day be usable by everyone, but right now the computers are being designed for specific tasks only. D-Wave knows it is far from an evolved machine able to compute and process numerous kinds of tasks, so it is aiming at top customers such as Google and NASA. In this day and age, quantum computers are not applicable to everyone, and a classical computer can certainly do just about everything that today's quantum computers can do. 

Hopefully, this article has given you enough information to understand the concepts and designs behind quantum computing, how large companies are beginning to use it, and why it is still very much in its infancy. If you have any questions or feedback, be sure to leave a comment! 

 

Comments

1 Comment


  • picopi 2016-03-21

    “Research has been published stating that these quantum computers aren’t actually performing any quantum physics mechanics; they are compared to be the same as classical computers and the quantum processor was sometimes found to be 10 times faster, but more than often it was 100 times slower than a classical computer. “
    I think says it all.