IBM Q: Evolving Quantum Computing and its Ecosystem

By Charles King, Pund-IT, Inc.  March 8, 2017

The IT industry is seldom good at describing how new products and entirely new classes of technology come into being. This is partly a matter of industry culture and partly due to vendors exerting control over their own brands and narratives. In other words, vendors prefer to be applauded for the brilliance and foresight that goes into successful products rather than acknowledge the blind alleys and rats’ nests that are a common part of the development process.

But now and again something comes along that is so thoroughly different from what has gone before, and offers such clear opportunities for advancement, that its evolution becomes part of common discourse and it “grows up in public.” Quantum computing, and the new IBM Q effort, is a good example of this.

The road to IBM Q

As noted in this week’s IBM Q press release, the company has spent decades on quantum computing research, including fundamental efforts in quantum processor fabrication, system design, materials development and software and programming tools. Those efforts came to a head in May 2016 when the company launched the IBM Quantum Experience, a service available through IBM Cloud.

The service allows users to run algorithms and experiments on IBM’s quantum processor, work with individual quantum bits (qubits), and explore quantum computing tutorials and simulations. Since its launch, about 40,000 users from over 100 countries have run more than 275,000 experiments on the IBM Quantum Experience. In addition, 15 third-party research papers based on projects using the service have been posted to arXiv, five of which have been published in leading journals.

Let’s take a look at what IBM announced. First and foremost, the company released an initiative and roadmap for building “commercially available universal quantum computing systems.” What does that mean exactly? In IBM’s view, quantum computers come in three forms:

  • Quantum Annealers ranging in size from 1 to 49 qubits that are mainly used for optimization problem experiments and deliver performance analogous to “classical” computers;
  • Approximate Quantum systems ranging in size from 50-100 qubits that are notably faster and more powerful than any classical systems and are useful for a range of quantum chemistry, materials science, sampling, quantum dynamics and optimization problem applications; and
  • Universal Quantum computers that can scale to thousands of qubits, are massively (by several orders of magnitude) faster and more powerful than classical computers and can be applied across the same use cases as Approximate Quantum systems, as well as in secure computing, machine learning, cryptography, and advanced searching.

In other words, IBM has found viable ways to scale the technologies initially developed for the Quantum Experience into saleable, commercial Universal Quantum systems, and has a workable roadmap in place. That is a very big deal.

Evolving quantum performance and demand

Can IBM accomplish this? It seems unlikely that the company would make such a claim unless it believes it has a way forward. In fact, telling tales or outright lying about such a high-profile subject could seriously damage the company’s credibility with customers, partners and research communities—long-term risks that would far outweigh any short-term gains.

This week’s announcement also noted two other sizable steps leading toward its quantum computing goals:

  • A new application programming interface (API) that will enable developers to build interfaces between the five-qubit Quantum Experience system and classical computers, and
  • An upgraded simulator for the Quantum Experience that can model circuits of up to 20 qubits, along with a full software development kit (SDK) for building simple quantum applications and programs
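IBM has not published the internals of its simulator, but the basic idea behind simulating a small quantum circuit on a classical machine can be sketched in a few lines of Python. The example below (an illustration using NumPy, not IBM’s actual toolkit) tracks the full state vector of a two-qubit system and prepares an entangled Bell state:

```python
import numpy as np

# The state of n qubits is a vector of 2**n complex amplitudes.
# Start in |00>: amplitude 1 at index 0.
state = np.zeros(4, dtype=complex)
state[0] = 1.0

# Single-qubit gates are 2x2 unitary matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)

# Apply H to the first qubit (tensor product builds the 4x4 operator).
state = np.kron(H, I) @ state

# CNOT: flip the second qubit when the first qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ state

# Measurement probabilities are |amplitude|**2 for each basis state.
probs = np.abs(state) ** 2
for basis, p in zip(["00", "01", "10", "11"], probs):
    print(basis, round(p, 3))
```

The output shows probability 0.5 each for “00” and “11” and zero for the other outcomes: the hallmark of entanglement. Because the state vector doubles with every added qubit, this brute-force approach is exactly why a classical simulator tops out around 20 or so qubits on ordinary hardware.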

In addition, IBM described its engagements with researchers and academic institutions, including MIT, the University of Waterloo and the European Physical Society to leverage the IBM Quantum Experience as an educational tool. It also described some of the work it is doing with Samsung, JSR, Honda, Canon, Hitachi Metals and Nagase through the IBM Frontiers Institute, a consortium that develops and shares groundbreaking technologies, including quantum applications, and evaluates their business and industrial implications.

In other words, along with actively developing commercial Universal Quantum computers, IBM is also proactively building the markets and demand for those devices. This may seem simply mercenary to some people but it illustrates how technologies must be commercially grounded in order to succeed and evolve.

IBM’s efforts also offer a great example of how to bring an essentially unique product to market. Since quantum computers are significantly different in form and function from classical systems, commercializing them requires sizable education efforts by vendors, universities, businesses and industries. How can people imagine, let alone undertake to build something entirely new unless they understand the tools at their disposal?

Why quantum computing?

The concepts behind quantum computing have existed for decades, but progress has been so slow, the technologies required so esoteric and the cost of investment so high that few have seriously pursued quantum computing development.

IBM has long focused its energies on quantum computing and has helped deliver notable research breakthroughs, including the physics of quantum computation (1981), quantum cryptography (1984), quantum teleportation (1993), criteria for building a quantum computer (1996), experimental factoring (2001), coherence time improvement (2012) and quantum code (2015).

So what drives IBM’s commercial interest in the subject? Several things. The massive improvements in parallel performance offered by quantum computers are certainly eye-opening. In fact, it has been estimated that the performance of a modest 30-qubit quantum system would equal that of a classical computer running at about 10 teraflops, a level that would have topped the Top500.org list of global supercomputers in the early 2000s.
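The source of such estimates is easy to see from first principles: an n-qubit system is described by 2^n complex amplitudes, so the classical resources needed merely to represent it grow exponentially. A quick back-of-the-envelope calculation (assuming 16 bytes per complex amplitude, a common double-precision representation) makes the scaling concrete:

```python
# Memory required to hold the full state vector of an n-qubit system,
# assuming one 16-byte complex double per basis-state amplitude.
for n in (20, 30, 40, 50):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16
    print(f"{n} qubits: {amplitudes:,} amplitudes, "
          f"{bytes_needed / 2 ** 30:,.3f} GiB")
```

At 30 qubits the state vector already occupies 16 GiB; at 50 qubits it would require roughly 16 million GiB, far beyond any classical machine. This is the exponential wall that makes quantum hardware interesting in the first place.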

However, the core functionality of quantum computers allows them to tackle problems whose complexity grows exponentially, placing them beyond the practical reach of classical systems. According to Dr. Jerry M. Chow, a manager and staff member in Experimental Quantum Computing at IBM’s T.J. Watson Research Center who recently briefed analysts on the IBM announcement, “While classical computers are really good at crunching through known data, quantum computers excel at working with and exploring experiential data.”

For example, quantum systems could revolutionize drug and materials discovery by vastly speeding computational processes and enabling new forms of analysis to unlock new medicines and materials for industry. Quantum capabilities could also enhance data security with cryptographic codes unbreakable by conventional systems. Supply chain and logistics processes, financial services and artificial intelligence are other areas that could profit from quantum breakthroughs.

Is it any wonder that IBM is putting substantial investments and energies behind IBM Q?

Final analysis

Fair enough, but how soon will we see IBM Q deliver those and other breakthroughs? Frankly, despite the substantial progress IBM has made, quantum solutions and their potential markets are still in very early days. But examining the history of any transformational technology—time and navigation instruments, internal combustion engines, power generation and delivery, railway and road construction, computer hardware and software—reveals how crucial research and investment are to progressive achievements.

IBM isn’t the only vendor focusing on quantum computing but this latest announcement suggests that the company’s efforts are gaining substantial traction and momentum, and will eventually result in both commercially viable IBM Q systems and a ready market for those solutions.

© 2017 Pund-IT, Inc. All rights reserved.