Lately, the news has been full of quantum computers. For example, China recently launched the first quantum communications satellite. That leaves some asking, “How long until we see these quantum machines powering our homes and sitting in our pockets?” Some futurists claim that we could see a major breakthrough any day now. I’m not pessimistic about such claims, but a cold, hard reality remains: we still have major hurdles to clear before that day comes.
First, why is it so important that we develop quantum technology? Anyone familiar with Moore’s law and the hardware development cycle can rattle off the answer pretty quickly. Our modern computers rely on many small transistors handling electrical current. Moore’s law states that roughly every two years, the number of transistors inside a processor doubles, while the physical size of the processing chip stays roughly the same.
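The exponential growth that Moore’s law describes is easy to sketch in a few lines of code. The starting figure below (Intel’s 4004 from 1971, with roughly 2,300 transistors) and the two-year doubling period are the commonly cited numbers; this is a rough illustration, not a precise model.

```python
def transistors_after(years, start_count, doubling_period_years=2):
    """Project a transistor count forward under a fixed doubling period."""
    doublings = years / doubling_period_years
    return start_count * 2 ** doublings

# Intel's 4004 (1971) started with roughly 2,300 transistors.
# Forty years of doubling every two years is 20 doublings:
projected = transistors_after(40, 2_300)
print(f"{projected:,.0f}")  # roughly 2.4 billion transistors
```

Twenty doublings multiply the count by about a million, which is why a law about a fixed-size chip eventually collides with the physical size of a transistor.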
The catch behind Moore’s law is this: transistors can only get so small and still remain stable. That size is approximately seven nanometers. Below that, a transistor becomes unstable and virtually useless. Once we can no longer shrink transistors using current conventions, processors of the same physical size can no longer become more powerful. This is where quantum technology comes in.
Quantum technology relies on the weird properties of the atomic world discovered by Einstein, Bohr, and many other great physicists. Once we deal with information and physics at the atomic level, classical Newtonian physics slips away. We enter a world where electrons can seemingly be in many places at once, act as waves or particles, and even share correlations that appear to act faster than the universal speed limit—the speed of light.
Quantum technology mostly revolves around the properties of electrons. Anyone with a background in basic electrical theory knows that the charges of electrons make modern circuits work. Quantum technology, though, relies on additional properties of the electron: how it spins, where it is located, and how it is entangled (which we’ll touch on later).
Quantum computers are in development at big universities and companies, which leads some to ask, “Why don’t we have them at home?” Although it’s possible to have a quantum computer in your house, the cost of running it would be astronomical. Quantum computers have to be cooled to almost absolute zero—the temperature at which the motion of particles essentially comes to a stop. Here’s a quick way to imagine such a low temperature: picture a room so cold that when you hit the light switch, the electrons are too cold to move energy through the light bulb. To run a quantum computer, you need a refrigerator on steroids, and even a day of operation would bankrupt most people.
Even though it’s expensive, quantum technology still has great advantages over conventional technology. Our current technology sees everything in the form of a binary value system. Something is either on or off, determined by a high or low voltage respectively. Quantum technology, on the other hand, relies on qubits (short for quantum bits), which depend on the quantum properties of the electron. Thanks to superposition, a single qubit can hold a blend of both states at once, and a group of qubits can represent exponentially more states than the same number of classical bits.
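The gap between bits and qubits can be made concrete with a counting sketch. The point below is only about bookkeeping: fully describing n classical bits takes n values, while fully describing n qubits takes 2 to the power n complex amplitudes, one per possible basis state.

```python
def classical_state_size(n_bits):
    # One value (0 or 1) per bit fully describes the state.
    return n_bits

def quantum_state_size(n_qubits):
    # One complex amplitude per basis state: |00...0> through |11...1|.
    return 2 ** n_qubits

for n in (1, 8, 64):
    print(n, classical_state_size(n), quantum_state_size(n))
```

At 64 qubits the full description already needs more amplitudes than there are values a 64-bit register can hold, which is exactly why classical machines struggle to simulate quantum ones.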
So, what about all that entanglement business and communication faster than the speed of light? A great literary example comes in the form of Orson Scott Card’s Ender’s Game series. In the last three books of the Ender Saga, the titular character has a connection to the bugger queen, whose species was destroyed in the first book. She still exists, reachable through the ansible—a network the buggers used so the queen could communicate with and control all of her workers instantaneously. This is a prime example of what quantum entangled communications would look like. In this form of communication, two or more electrons attain a mysterious connection, which we still don’t fully understand, and take on the same properties. If we take one electron and reverse its spin, its entangled partner will reverse its spin instantaneously, regardless of distance (though, frustratingly for sci-fi fans, physicists have shown that this effect cannot be used to transmit usable information faster than light).
All of these new technologies are still a long way off from entering our daily lives. However, there are solutions that are less often spoken of. Although we don’t see them on the news, they could get us past the dead end we are approaching with current technology. The solution is simple: we just change how we do things. So, is quantum technology still the answer for the future of our laptops, desktops, phones, and tablets? Maybe not.
One obvious solution is to make the physical size of processing chips bigger. To double power, we could just double size. There is an issue with this, though. At what point do we go back to the room-sized computers once used by universities, scientists, and governments? I see this as a short-term fix; in the long term, it’s not preferable.
Perhaps one of the best solutions would be to change the architecture itself. It would require replacing many systems, but it could buy time to keep advancing past Moore’s law while we figure out how to do quantum computing efficiently and cost-effectively.
Right now, computers are based on the von Neumann architecture, developed primarily by John von Neumann. It relies on a design that became known as the stored-program concept, in which program instructions and data share the same memory. This design was a major step forward in advancing the use of a binary system and in shaping the interaction between a computer’s RAM and processor.
Just because von Neumann’s architecture has reigned for so long doesn’t mean it has to continue. If it weren’t for the limits Moore’s law is running into, we wouldn’t need to seek alternatives—something von Neumann probably never imagined happening anytime soon. Alternatives have been proposed over the decades, but rarely have they been fully implemented. This article won’t cover all of these alternatives, but we’ll take a brief look at a few options.
One proposal recommended changing to a ternary-dominant system. As mentioned earlier, on and off are determined by a high or low voltage. The proposal suggested expanding the range of voltages read. For example, instead of simply reading a high or low voltage, a processor could use three settings: high, medium, and low. In binary terms, instead of being only a 0 or 1, each digit would have the ability to be a 0, 1, or 2. Depending on sensitivity and stability, we could create additional voltage states: high, high-medium, medium, and so on. This is just one architectural change suggested over the decades.
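The payoff of the ternary idea is a counting argument, which a short sketch can make concrete. The hardware details (how voltages map to digits) are hypothetical here; the code only shows how the same number of digits covers a larger range when each digit has three values instead of two.

```python
import math

def to_base(value, base):
    """Return the digits of a non-negative integer in the given base."""
    if value == 0:
        return [0]
    digits = []
    while value:
        value, remainder = divmod(value, base)
        digits.append(remainder)
    return digits[::-1]

# The same number needs fewer ternary digits than binary digits:
print(to_base(42, 2))  # [1, 0, 1, 0, 1, 0] -- six binary digits
print(to_base(42, 3))  # [1, 1, 2, 0]       -- four ternary digits

# Eight binary digits cover 2**8 = 256 values;
# eight ternary digits cover 3**8 = 6561 values.
# Each trit carries log2(3) bits of information:
print(round(math.log2(3), 2))  # 1.58
```

So every ternary digit does the work of about one and a half bits, which is the whole appeal of reading more voltage levels per wire.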
Each suggested change brings new ways of computing. Some alter the RAM-CPU relationship by adding components that take on new processing tasks to lighten the load on the RAM and CPU. One example is having the graphics processor become more involved in regular operations, a technique some systems already use today. These are just a couple of the many options engineers have brought forward. Some of these alternatives were first discussed over three decades ago. Even so, it is never too late to consider these suggestions. After all, our current model of computing has been in place for over fifty years!
We don’t need to fear the end of Moore’s law. There are plenty of alternatives that we could consider. However, there is a real possibility that we could all be using quantum technology soon. Instead of holding a quantum device in your hands, major tech companies could offer quantum cloud computing services. To me, this is the most foreseeable future in the next 10 to 15 years. At this point, the only thing we can say for sure is that time will tell.