For a long time, quantum computing remained exclusively in the realm of theoretical physics rather than engineering or applied computation. This is no longer the case, as several real-world, practical operations have recently been performed. In September 2017, for example, IBM modeled the lowest-energy state of beryllium hydride on a quantum computer.
Although this particular simulation is still possible on a classical computer, successful molecular modeling demonstrates that a quantum computer is capable of advanced computation. As quantum computers continue to add qubits, the quantum equivalent of bits, we’ll see progress in areas like chemistry, physics, and materials science far beyond what a conventional computer could facilitate.
Significance to data centers: is the industry set to evolve?
All the press coverage of quantum computers has led many in IT to consider what mainstream quantum computing would mean for them. So will quantum computers disrupt the data center industry as we know it? Not likely. At least not without some miraculous discovery overturning everything we currently know.
Until then, a few key factors will prevent traditional data center servers from being replaced entirely.
The challenges of creating commercial viability
Quantum computers not only have limited flexibility for multi-purpose use; they also come with some real impracticalities.
For one, they’re very error-prone. We can’t “read” outputs from quantum computers in the traditional sense. Regular computers have binary inputs and outputs; they translate 1s and 0s back into something meaningful, and you can look at your data at any time, in any way you want.
With the “qubits” of quantum computers, things are a little trickier. Essentially, a qubit can be in a “superposition” of both 1 and 0 at the same time. When you measure it, there is neither a 100% chance of getting back a 0 nor a 100% chance of getting back a 1; the outcome is probabilistic. You can even measure two identically prepared qubits and get different answers.
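As a rough illustration of that randomness, here is a minimal sketch in plain Python (no quantum hardware or quantum library involved); the amplitudes and the measure helper are purely illustrative stand-ins for how a single qubit's readout behaves.

```python
import random

# Illustrative only: a qubit state is described by two amplitudes (alpha, beta).
# The probability of reading 0 is |alpha|^2 and of reading 1 is |beta|^2.
alpha, beta = 2**-0.5, 2**-0.5   # an equal superposition of 0 and 1

def measure(alpha, beta):
    """Collapse the state: return 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha)**2 else 1

# Measuring ten identically prepared qubits can easily yield different answers.
print([measure(alpha, beta) for _ in range(10)])  # e.g. [0, 1, 1, 0, 0, 1, ...]
```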
Additionally, any attempt to read the output as a whole makes the superposition fall apart (as in the famous Schrödinger's cat example), so an indirect approach is required. Imagine removing one piece from a Jenga tower without knowing whether it's even the right piece.
While some algorithmic outputs are easy to verify, such as those of Shor's algorithm (a candidate factorization can simply be multiplied back out and checked), others are more problematic. In practice, this means running the calculation repeatedly until the result is suitably close to the desired margin of error.
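To make that verify-and-repeat loop concrete, here is a hedged sketch in Python. The quantum_factor_attempt function is a made-up stand-in for a noisy run on quantum hardware (here it is faked with classical randomness); the point is simply that each candidate answer is cheap to check classically, so the calculation can be re-run until a verified result appears.

```python
import random

def quantum_factor_attempt(n):
    """Stand-in for one noisy run of a quantum factoring routine.
    Here it just guesses a divisor candidate at random; a real run
    would come back from quantum hardware and often be wrong."""
    return random.randint(2, n - 1)

def factor_with_retries(n, max_runs=10_000):
    """Repeat the noisy calculation until a classically verifiable answer appears."""
    for run in range(1, max_runs + 1):
        p = quantum_factor_attempt(n)
        if n % p == 0:          # verification is trivial: just check the division
            return p, n // p, run
    return None

print(factor_with_retries(3127))   # 3127 = 53 * 59; e.g. (53, 59, 1562), run count varies
```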
However, at the scale of quantum chips we’re currently dealing with, we’ve come a long way with error rates. Google’s Bristlecone quantum processor recently demonstrated a 1% error rate for readout, 0.1% for single-qubit gates, and 0.6% for two-qubit gates. But this is still only a 72-qubit machine. Even tasks that quantum computing is especially good at, like factoring RSA encryption keys, would require thousands of qubits. Commercial viability is cited as requiring closer to one million qubits!
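As a rough back-of-the-envelope illustration of why those error rates still fall short (the circuit depths below are assumed for the sake of the example, not figures from Google), even a 0.6% two-qubit gate error compounds quickly across a deep circuit:

```python
# Rough, illustrative arithmetic: probability a circuit finishes with no gate error,
# assuming independent errors (a simplification) and a 0.6% two-qubit gate error rate.
error_per_gate = 0.006

for gates in (10, 100, 1_000, 10_000):
    p_success = (1 - error_per_gate) ** gates
    print(f"{gates:>6} gates -> {p_success:.1%} chance of an error-free run")

# ~10 gates    -> ~94%
# ~1,000 gates -> ~0.2%  ...which is why error correction, and many physical qubits
# per logical qubit, is needed before algorithms like Shor's become practical.
```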
Can quantum computers compete with conventional computing logistically?
Quantum computers also require specialized equipment to maintain, making consistent operation difficult at scale. Qubits are universally fragile. One type, based on superconducting circuits, must be cooled to near absolute zero just to remain stable, while another relies on trapped ions held in place by lasers inside a vacuum chamber.
Beyond the challenge of keeping errors under control, even once we create a functioning chip, we still have to build the basic framework for mainstream computation: an architecture, compilers, and algorithms. Estimates for achieving this are currently measured in decades, not years.
Say we do create a stable, one-million-qubit quantum computer, complete with the systems, frameworks, and software that let us do things like generate a QuickBooks report. There is still no real advantage; a conventional computer already does this as well as required. A quantum computer would simply be more expensive.
In fact, for most computing operations, there are no algorithms that allow quantum computers to beat traditional computers; the number of steps performed would remain essentially the same.
The future impact of this technology
Despite the aforementioned obstacles, quantum computers are more promising than traditional computers in some areas, such as big data.
As raw data volumes expand, analyzing them becomes increasingly difficult for data centers: a task a quantum computer could, in principle, accomplish in significantly fewer steps.
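For unstructured search, the best-known quantum approach, Grover's algorithm, needs roughly the square root of the number of lookups a classical scan needs in the worst case. The record counts below are made up purely to show the scale of that gap:

```python
import math

# Worst-case lookups for unstructured search: ~N classically vs ~sqrt(N) with Grover.
for n_records in (10**6, 10**9, 10**12):
    classical = n_records
    grover = int(math.isqrt(n_records))
    print(f"{n_records:>15,} records: ~{classical:,} classical steps "
          f"vs ~{grover:,} Grover iterations")
```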
Beyond simple data analysis, many have claimed that quantum computers will “destroy encryption as we know it.” For data centers, this sounds like a veritable nightmare. Thankfully, the claim is only partially true. Certain encryption algorithms will eventually be vulnerable, but plenty of others are not. Transitioning all of our sensitive data into a securely encrypted form is a large, but not impossible, task.
As artificial intelligence involves much of the same computation as data analysis, it could face disruption from quantum computing if a suitable algorithm emerges. However, that field will probably have progressed by leaps and bounds long before quantum technology makes practical contributions.
Will expected advances be disruptive to data center technology?
To summarize, quantum computing is still largely theoretical. While we’ve seen some incredible advancements in other areas, the IT sphere will almost certainly remain unaffected for some time.
When quantum computing technology does advance to the point of mainstream availability, it will likely take on a complementary role to existing data center technology, rather than a disruptive one.