Published Mar 18, 2012

Intro to Quantum Computing

This is by no means comprehensive; I was limited to five pages when writing it. But hopefully you'll understand a bit about the physics behind quantum computing and some of its applications. At the very least, it'll be a reference point if you want to do further research.

NOTE: There wasn't a good category for this, so it is now a how-to article :D The works cited are in the attached file.

Quantum Computing


Current Technology

“Classical” computers all follow a similar design. They are based on the von Neumann architecture and all exhibit what is known as the “von Neumann bottleneck”: without parallel programming, they can only perform one action at a time. With current processors, this works fine for most tasks. Processors can currently execute billions of instructions every second, and that number is constantly rising. Why is there a need for an alternate technology, then? With Moore's Law stating that the density of integrated chips will double about every eighteen months, there does not seem to be much need to change. But eventually engineers are going to hit a limit on how much they can put onto these chips; transistors can only get so small. Not to mention Rock's Law, which states that the cost of the plants that produce these chips will double every four years. So with the rising price of production and chip density reaching its limit, where do computers go next? Numerous alternatives have been theorized; the most promising and interesting of these is quantum computing.
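To make the doubling concrete, here is a minimal back-of-the-envelope sketch assuming the 18-month doubling stated above; the 2,300-transistor baseline (the Intel 4004 of 1971) is my own example figure, not something from this article:

```cpp
#include <cmath>
#include <iostream>

int main()
{
    const double baseline = 2300.0; // transistors on the Intel 4004 (1971)
    for (int years = 0; years <= 40; years += 10) {
        // Doubling every 18 months means growth by a factor of 2^(months / 18).
        double factor = std::pow(2.0, (years * 12.0) / 18.0);
        std::cout << "after " << years << " years: ~"
                  << baseline * factor << " transistors\n";
    }
}
```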

History of Quantum Computation

The theory of quantum mechanics has been around for about a century, but it is only in the last 30 years that its principles have been applied to computing. Many people credit Peter Shor as the father of quantum computation. He was the first to bring quantum computing closer to reality with his algorithm, known as Shor's algorithm, but the initial idea of a quantum computer was not his. The man credited with first proposing a quantum computer is Richard Feynman. He addressed the issue of classical computers being poorly suited to simulating real-world physics, and his proposed fix was to build a computer primarily out of quantum mechanical elements that would obey quantum laws. Two other key figures in the development of quantum computation were Stephen Wiesner and David Deutsch. Wiesner was a leading figure in quantum cryptography and applied the uncertainty principle to it. Deutsch showed that any physical process could be modeled perfectly by a quantum computer.

Quantum Physics for Computation

The main concept behind quantum computing is the qubit. A qubit is similar to a bit, yet very different. A bit is restricted to the values 0 and 1; a qubit can represent 0 and 1 as well, but it can also exist in a superposition of both. The idea of superposition is that a quantum particle can exist partly in all of its possible states at once. This is what makes quantum computation so interesting. Superposition is what gives a quantum computer its parallelism, the ability to work on many computations at a single time. For example, according to Deutsch, a 30-qubit computer could theoretically operate at ten teraflops, or ten trillion floating-point operations every second! Classical computers today operate at gigaflop speed, billions of floating-point operations per second, and they need far more than 30 bits to do it. A problem with this theory is that the moment a qubit in superposition is looked at, it assumes the value of either 1 or 0, essentially making it a fancy bit. It is also possible to accidentally “bump” a qubit when trying to look at it and change its value. The fix to these issues is another important concept in quantum theory known as quantum entanglement. Entanglement means that if an outside force is applied to two atoms, they become entangled and one atom will assume the properties of the other. This lets physicists examine an atom by looking at its entangled counterpart, removing the risk of “bumping” the atom holding the value. Without entanglement, quantum computers would be nothing but a really expensive and complicated digital computer.
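To make superposition and measurement collapse concrete, here is a minimal classical simulation of a single qubit, sketched in C++. The equal 1/√2 amplitudes and the 1,000-measurement loop are illustrative choices of mine, not anything prescribed above:

```cpp
#include <cmath>
#include <complex>
#include <iostream>
#include <random>

int main()
{
    // A single qubit is a pair of complex amplitudes (a, b) with
    // |a|^2 + |b|^2 = 1. Here both are 1/sqrt(2): an equal superposition.
    const double r = 1.0 / std::sqrt(2.0);
    std::complex<double> a(r, 0.0); // amplitude of |0>
    std::complex<double> b(r, 0.0); // amplitude of |1>

    const double p0 = std::norm(a); // probability of collapsing to 0 (0.5)
    const double p1 = std::norm(b); // probability of collapsing to 1 (0.5)
    std::cout << "P(0) = " << p0 << ", P(1) = " << p1 << '\n';

    // Measurement collapses the superposition to a definite 0 or 1.
    // Simulate 1000 identically prepared qubits being measured.
    std::mt19937 gen(std::random_device{}());
    std::uniform_real_distribution<double> dist(0.0, 1.0);
    int zeros = 0, ones = 0;
    for (int i = 0; i < 1000; ++i)
        dist(gen) < p0 ? ++zeros : ++ones;

    std::cout << "collapsed to 0: " << zeros
              << ", collapsed to 1: " << ones << '\n';

    // Note: simulating n qubits this way needs 2^n amplitudes, which is
    // where the exponential parallelism claim above comes from.
}
```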

Implications for Computer Scientists

Perhaps the most interesting aspect of quantum computing for computer scientists is how it affects algorithms and algorithm design. Quantum algorithms could solve some problems exponentially faster than any current algorithm, and could solve any other problem at least as fast as current technology. One example of what quantum computers could solve much faster is number factorization. With current technology, factoring large numbers is computationally infeasible, and this is what RSA encryption is based on. The ability of quantum computing to factor numbers much faster than classical computing has been demonstrated with Shor's algorithm, which was first demonstrated in 2001 by a group at IBM. They used a quantum computer with seven qubits to factor 15, the smallest number this algorithm can factor. A reason why some calculations are much faster on a quantum computer is parallelism. Currently, parallel computing happens through extra hardware and multiple computers, but with quantum computing it could all be done in a single processor. For example, if one takes one qubit in superposition and performs a calculation with another qubit in the same superposition, one gets four results at once: 0/0, 0/1, 1/0, and 1/1. With two qubits operated on against two other qubits, one gets 00/00, 01/00, 10/00, 11/00, 00/01, 00/10, and so on.
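To ground the factoring claim, here is a small classical sketch of the number theory Shor's algorithm is built around, using the N = 15 example above (the base a = 7 is my own illustrative choice). The brute-force order-finding loop is precisely the step the quantum subroutine speeds up; the rest is ordinary classical post-processing:

```cpp
#include <iostream>
#include <numeric> // std::gcd (C++17)

// Finds the order r of a mod n, i.e. the smallest r with a^r = 1 (mod n).
// This brute-force loop is the step Shor's quantum subroutine replaces.
int order(long long a, long long n)
{
    long long x = a % n;
    int r = 1;
    while (x != 1) {
        x = (x * a) % n;
        ++r;
    }
    return r;
}

// Modular exponentiation: base^exp mod m.
long long pow_mod(long long base, int exp, long long m)
{
    long long result = 1;
    base %= m;
    while (exp > 0) {
        if (exp & 1) result = (result * base) % m;
        base = (base * base) % m;
        exp >>= 1;
    }
    return result;
}

int main()
{
    const long long N = 15, a = 7;            // a chosen coprime to N
    const int r = order(a, N);                // order of 7 mod 15 is 4
    const long long h = pow_mod(a, r / 2, N); // 7^2 mod 15 = 4

    // Since r is even and h != N - 1, gcd(h - 1, N) and gcd(h + 1, N)
    // are nontrivial factors of N.
    std::cout << N << " = " << std::gcd(h - 1, N)
              << " * " << std::gcd(h + 1, N) << '\n'; // prints "15 = 3 * 5"
}
```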

Pros and Cons of Quantum Computing

Quantum computing could have an interesting impact on the technology world. The most intriguing benefit is likely the inherent parallelism that quantum computing brings: being able to perform exponentially more calculations at any given time than a classical computer. This parallelism can be seen as both a benefit and a threat. Because of it, a quantum computer would be able to factor large numbers in a reasonably short amount of time, a task that is currently infeasible. The issue is that some encryption techniques keeping a lot of important information safe are based on the fact that factoring large numbers is currently infeasible. A quantum computer could easily break any encryption protocol that relies on large numbers being extremely difficult to factor, which could, in theory, leave all of that information unprotected. Another benefit that comes from quantum parallelism is the ability to search large databases much faster than today. One other benefit of quantum computing is true randomness. In current computers, random numbers are generated by a deterministic algorithm, so only pseudo-random numbers can be produced. One of the core concepts of quantum mechanics is its inherent randomness. For example, if a single photon is shot at a beam splitter, that photon will go one of two ways with an equal 50/50 chance. There is no way of determining exactly which way it will go; only an educated guess can be made. Aside from the benefits of quantum computing, there are some downsides. The obvious con is the sheer cost and complexity of developing these machines. In 2011, the research team of Rainer Blatt succeeded in creating a fourteen-qubit register, a record at the time. So a lot of time is still needed before a practical quantum computer is made. There is just too much that can go wrong in the quantum world.
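As a quick illustration of why classical random numbers are only pseudo-random, here is a minimal C++ sketch; the Mersenne Twister generator and the seed value 42 are my own example choices:

```cpp
#include <iostream>
#include <random>

int main()
{
    // Two generators seeded identically produce identical "random"
    // sequences: a classical generator is fully determined by its seed,
    // hence pseudo-random. A measured qubit has no seed to replay.
    std::mt19937 first(42), second(42);
    for (int i = 0; i < 5; ++i)
        std::cout << first() << " == " << second() << '\n';
}
```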

Hypothetical Future

Quantum computing is very interesting to think about because of its possible advantages, but currently it is just too complex and error-prone to become a practical system. Physicists, however, are making a lot of progress in solving these issues. So what might a future with quantum computing look like? Quantum computers will not become the new PC, at least not for quite a long time, but there will likely be large quantum computers that act as server systems. The personal computer world and the Internet have already become one, so a series of “cloud” quantum computers spread out for people to connect to and perform complex calculations on is very believable. It is too early to say how much the average user will gain from advancements in this field, but it will definitely help large corporations and the science world, and it would lead to advancements in our understanding of the quantum world in general. A quantum computer acting as a server would be nearly immune to denial-of-service attacks, given the amount of traffic a quantum computer with just a few hundred qubits could handle. This would also mean fewer servers would be needed to handle the world's traffic.

Conclusion

Current technology can only advance so far. Transistors have been made smaller and smaller each year since their inception, but they can only shrink so much before they reach atomic scale. Once that scale is hit, transistor technology will have reached its maximum potential. Researchers are working on possible solutions, but quantum computing has generated the most attention. The sheer computing power that could be gained from a relatively small number of quantum bits is astounding. All this power can also lead to problems. A lot of the world's financial information is protected by encryption built on large numbers that are infeasible to factor with current technology; with this new power, those encryption systems could be broken in a day. There may be some consequences of quantum computing, but the benefits can be seen to outweigh them. The next generation of computing is being made right now.