This is the final article in a seven-part series on Algorithms and Computation, which explores how we use simple binary numbers to power our world. The first article, How Algorithms Run the World We Live In, can be found here.
As an elder millennial, I came of age just as the most important human invention since the wheel started showing up in our mailboxes in the mid-1990s: software CDs offering free trials for services like America Online, CompuServe, and Prodigy. Those first exploratory steps into this revolutionary digital space came when we were old enough to clearly remember life before the Internet but still young enough to embrace the technology in ways our parents couldn't.
We gladly ran up thousand-dollar credit card bills binging on chat rooms, message boards, instant messaging, and other primordial internet content--that's right, kids, back then we had to pay for the Internet by the hour--but that was mom and dad's problem; we had an entire civilization-altering transition to participate in. Transformational progress on a global scale normally takes time, generations even, to achieve, but we pulled it off in less than a decade and spent another decade pushing the limits of what was possible with a computer and an Internet connection. Unfortunately, we began running into those limits pretty quickly.
The Rise and Decline of the Classical Computer
For all intents and purposes, the Internet is the tour de force of classical computing. Networked together, billions of computers of every shape and size collaborate through algorithms, radio signals, and fiber optic cable to produce a way of life that as far as we know is unique in the universe. Even more incredible is that classical computing accomplished this in less than two generations of human beings, a rate of technological progress without historical precedent.
RELATED: WHAT WILL QUANTUM COMPUTING CHANGE, EXACTLY?
For 40 years, Moore's Law drove the unprecedented human progress of the post-war era, but a silicon computer chip is a physical material, so it is governed by the laws of physics, chemistry, and engineering. After miniaturizing the transistor on an integrated circuit to a nanoscopic scale, transistors simply can't keep getting smaller every two years. With billions of electronic components etched into a solid, square wafer of silicon no more than 2 inches wide, you could practically count the atoms that make up the individual transistors.
Intel's recent troubles with glaring security vulnerabilities in their processors are a direct result of engineers having to come up with creative ways to improve processor performance and speed when it's no longer possible to physically improve the integrated circuit itself. As transistors have shrunk to just 7 nanometers long, engineers have reached the point where transistors use the fewest atoms possible to build a working component. Any smaller, and the structural integrity of the transistor would quickly break down, and it would lose its ability to contain and direct the electrical current that conveys the information that makes computers so powerful.
Computers have never been faster or more agile when it comes to switching and manipulating the electric current that powers their operations, but you just can't get electrons to move at a speed other than the one determined by the medium they're traveling through. The only way to "speed up" the flow of electrons is to reduce the distance they have to travel between logic gates so that operations produce results a few trillionths of a second faster than before, which is what we've been doing for 40 years.
Modern computer processors are undeniably fast, but unfortunately not fast enough. Despite its incredible power, the classical computer has been effectively defeated by the mathematical realities of intractable but critically important problems like optimization and protein folding. The sequential nature of classical computer operation means that, on their own, classical computers will never be able to overtake the rate of growth of an O(2^n) or O(n!) problem.
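To see why raw hardware speed can't rescue us, consider how little extra headroom even a massive speedup buys against exponential or factorial growth. This is a toy sketch of my own, not from the article's sources; the operation budgets are invented round numbers:

```python
import math

def max_n(ops_budget, cost):
    """Largest problem size n whose cost stays within the operation budget."""
    n = 1
    while cost(n + 1) <= ops_budget:
        n += 1
    return n

budget = 10**12           # ops budget for a hypothetical fast classical machine
faster = budget * 1000    # the same machine after an imagined 1,000x speedup

for name, cost in [("2^n", lambda n: 2**n), ("n!", math.factorial)]:
    before, after = max_n(budget, cost), max_n(faster, cost)
    print(f"{name}: n={before} -> n={after} after a 1,000x speedup")
```

A thousand-fold speedup lets the O(2^n) problem grow from n=39 to only n=49, and the O(n!) problem from n=14 to n=17: the curve swallows the hardware gain almost immediately.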
No one wants to accept that the incredible technological ride we've enjoyed for the past half-century is coming to an end, but unless algorithms are found that can provide a shortcut around this rate of growth, we have to look beyond the classical computer if we are to maintain our current pace of technological progress.
Hype Around Post-Classical Computing Sounds Utopian, but Surprisingly Justified
Quantum computing is a subject that a lot of people, myself included, have gotten wrong in the past and there are those who caution against putting too much faith in a quantum computer's ability to free us from the computational dead end we're stuck in.
The technology is in its infancy and there are any number of reasons to doubt that we'll ever see the quantum computer equivalent of the Apple II home computer. It's not just the qubits you have to master; you'd also have to discover a material capable of room-temperature-ish superconductivity, as well as figure out how to maintain an internal environment for the qubits that needs to be kept as close to absolute zero as possible to work.
Besides, the vast majority of the work a computer needs to do won't be performed any faster on a quantum computer than on a classical one. Sequential operations aren't what quantum computers are designed for, so even after quantum computers fully arrive, we will keep using classical computers for the foreseeable future, while quantum computers will likely stay in corporate and national labs, with processing services provided through cloud computing on an algorithm-by-algorithm basis.
For all the work needed to create and maintain qubits in superposition, quantum computers don't really do much of anything at the moment and that will probably remain the case for a little while longer at least. You'd be forgiven for thinking that quantum computers are a whole lot of hat and no cattle, but that would also be a serious mischaracterization of the state of the technology and glosses over the significance of what we already know is just now coming over the horizon.
One of the strengths of mathematical systems is their provability with logic. If we can prove a thing to be true logically, that truth will never change. This is the kind of thing that gives us the confidence to build rockets and spacecraft that can be piloted with almost pinpoint precision from four and a half billion miles away, and it's why we can say not only that quantum computing will be transformative, but exactly why.
In the 25 years since Peter Shor published the first quantum algorithm--which showed that prime factorization of integers could be done on a quantum computer in polynomial time--mathematicians and computer scientists have developed other quantum algorithms that tackle problems classical computers struggle to solve. Of those dozens of quantum algorithms, many are orders of magnitude faster than the most efficient classical algorithm we know of and are only possible because of the unique quantum environment they operate in.
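For context, the classical baseline that Shor's algorithm improves on can be sketched in a few lines. Trial division does work proportional to the square root of N, which is exponential in the number of digits of N; Shor's quantum algorithm brings that down to a polynomial in the digit count. This toy code is my own illustration, not from the article's sources:

```python
def trial_division(n):
    """Return the prime factors of n by testing divisors up to sqrt(n)."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime factor completely
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(trial_division(15))   # [3, 5]
```

For a 15-digit number this loop may grind through tens of millions of candidate divisors; for the 600-digit numbers used in modern cryptography, it is hopeless, which is exactly why a polynomial-time quantum factoring algorithm was such a shock.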
Some of the most important work in the quantum computing field has been creating algorithms that simulate the quantum systems that pop up in everything from laser technology to medicine, and these algorithms will be able to outperform comparable classical simulations by a wide margin. Currently, classical algorithms that perform molecular simulation are limited in the kinds of molecules they can simulate. These algorithms are usually restricted to molecules with fewer than 70 spin-orbitals; any more than that, and the complexity of the simulation grows so quickly as to become intractable.
Meanwhile, a single qubit can represent one of these orbitals efficiently enough that a quantum computer with only 100 qubits--the D-Wave 2X quantum computer has 1152 qubits, though it was built to run a specific algorithm, not as a general-purpose quantum computer--would allow for molecular simulations that classical computers aren't even close to being capable of and probably never will be. These simulations could reveal all manner of previously unknown compounds that might provide novel therapies for any number of diseases.
There are quantum algorithms for everything from depth-first searches and quantum walks over a graph to solving systems of linear equations and differential equations, and even for making progress on certain classes of optimization problems, such as adiabatic optimization. What these algorithms lack, however, is a sufficiently powerful quantum computer with enough qubits to run on.
That won't be the case forever, though, and when the time comes to take these algorithms off the shelf and put them to work, some of the most frustratingly intractable, exponentially and factorially complex problems in business, administration, medicine, engineering, and more will see superpolynomial speedups or better. These gains are the real deal and they're guaranteed by logic to work; the only question is how long it will take for these computers to arrive.
Redefining the Computer for the Post-Classical Age
The problem facing classical computers going forward is intrinsic to the electronic nature of the computers themselves. Having evolved from simple electronic circuits, computers use a very specific computational methodology to solve problems, and so they're permanently locked into the sequential binary calculation model that electronics have been using for more than a century. But this model's dominant place in our technology doesn't mean it's the only way to perform computations.
Spintronics, which uses the spin of electrons and the magnetic properties this spin produces, is showing the most promise as a storage mechanism owing to its imperviousness to external magnetic disturbances, the kind that can erase entire hard disks that rely on current ferromagnetic data storage technology.
The magnetic qualities of electrons also suggest that a spintronic, semiconducting transistor could be built that would jolt Moore's Law back to life, at least for a while. Atoms might be small, but they're pretty much all nucleus; the electrons that orbit the nucleus are orders of magnitude smaller than the atom itself, so it should be possible to pack thousands of times as many spintronic transistors onto current silicon chips, giving classical computers a chance to get around that whole laws-of-physics-and-chemistry problem.
Moving away from our obsession with silicon chips, there is another major area of computational research that holds an incredible amount of potential. DNA computing might seem confusing and possibly a bit weird at first glance, but if you think about it, it's an obvious candidate for post-classical computing research and development.
Ever since the first strands of DNA encoded the instructions for the creation and operation of single-celled organisms, DNA has been a powerful mechanism for the transmission and storage of data, but researchers are now digging deeper into the individual building blocks of DNA itself and its potential as a computational mechanism in its own right.
Research has shown [PDF] that the four distinct nucleotide bases--A, T, C, and G--that serve as the building blocks of DNA can be repurposed to act as encodable bits. When mixed, these bases naturally self-assemble into strands of DNA--and not just any DNA, but all the different permutations of DNA possible with the available materials.
This is a potentially game-changing innovation since performing operations on a superposition of qubits is not the same thing as true parallel computing. Quantum computers will only give you a single output, either a value or a resulting quantum state, so their utility solving problems with exponential or factorial time complexity will depend entirely on the algorithm used.
DNA computing, however, harnesses these bases' ability to assemble themselves into long strands of DNA. Mix them together, and they will naturally build into longer and more complex permutations of the available set. If you've been following the series, that word should have jumped out at you. Permutation is a process with factorial time complexity, and it is the fundamental challenge that needs to be overcome if we are going to solve an NP-complete problem. Permutations are at the heart of optimization, and it's probable that even a quantum computer will find optimization beyond its power to solve.
That's what makes DNA computing such an exciting new development. If we encode the name of each city in the traveling salesman problem as some combination of bases and throw them all into a beaker, then once they start self-assembling into strands of DNA, the correct solution to the traveling salesman problem will grow organically out of this process.
In less than a minute, the solution to the traveling salesman problem will be sitting in that beaker in the form of a strand of DNA, and the challenge becomes finding a way to filter out the wrong answers until we can isolate this optimal solution. Filtering out the countless incorrect strands of DNA to find the optimal one is no small task, without question, but it's also not a problem of generating every possible permutation. As we saw with Shor's algorithm, sometimes the key to finding the solution to an intractable problem is to turn it into an equivalent problem that is easier to solve.
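For a sense of what the beaker sidesteps, here is the brute-force approach a sequential classical computer is stuck with: enumerating every ordering of the cities, which is (n-1)! round trips. The four cities and distances below are invented purely for illustration:

```python
from itertools import permutations

# Hypothetical symmetric distances between four made-up cities.
dist = {
    ("A", "B"): 2, ("A", "C"): 9, ("A", "D"): 10,
    ("B", "C"): 6, ("B", "D"): 4, ("C", "D"): 8,
}

def d(x, y):
    """Look up the distance between two cities in either direction."""
    return dist.get((x, y)) or dist[(y, x)]

def tour_length(order):
    """Total length of a round trip that starts and ends at city 'A'."""
    stops = ("A",) + order + ("A",)
    return sum(d(a, b) for a, b in zip(stops, stops[1:]))

# Brute force: check all (n-1)! orderings of the remaining cities.
best = min(permutations(("B", "C", "D")), key=tour_length)
print(best, tour_length(best))   # ('B', 'D', 'C') 23
```

With four cities that's only 6 tours, but at 20 cities it's already 19! (about 10^17) tours, which is the factorial wall the self-assembling chemistry would explore in parallel for free.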
While this is still a computationally hard thing to do, it's much simpler than brute-forcing permutations and validating them afterward to find the best route for our salesman to take. Ongoing research into DNA computing will reveal its true efficacy in time, but self-assembling DNA strands offer the promise of true parallel computation, something that even quantum computing cannot claim.
We are Quickly Approaching a Technological Event Horizon
It is entirely possible that before we see any of this, humanity will end up bombing itself into a new dark age that takes thousands of years to recover from.
It's important to remember that while progress isn't guaranteed, change always is, and the kind of technological and scientific retreat this new dark age would represent is the only comparison I can make that captures the scale of the change that may come with the transition into the post-classical era.
Humanity is genuinely approaching nothing short of a technological event horizon. There is something on the other side of the classical/post-classical divide, it's likely to be far more massive than it looks from over here, and anyone's prediction about what we'll find once we pass through is as good as anyone else's.
While it can be fun to speculate about specific advances, what will ultimately matter much more than any one advance will be the synergies produced by these different advances working together. Synergies are famously greater than the sum of their parts, but what does that mean when your parts are blockchain, 5G networks, quantum computers, and advanced artificial intelligence?
Whatever it ends up being, it will redefine computation as we know it and the implications of the new systems we'll create by integrating these various models will be so immense that the only thing I know for certain is that our passage into the post-classical world is guaranteed to be a one-way trip.