In the IT world, quantum computing is becoming the subject of ever more frequent discussion. And with good reason, as this somewhat exotic-sounding technology is about to cross the threshold to commercial use.

At the beginning of March, a news item went round the IT world that scarcely attracted attention. An IBM press release stated: “We have doubled the quantum volume of our Q System.” Only those with inside knowledge of the technology could judge the significance of this statement, namely that it represents a milestone on the road to the commercial use of quantum computing. IBM had already presented its Q System and reported on corresponding pilot projects at the company’s last “Think” customer event as well as at the CES trade fair in January. If one believes its marketing managers, quantum computers will soon be working veritable miracles. It is claimed that the performance of these systems will exceed that of today’s computers many times over. The applications mentioned time and again in this context are, above all, simulations, optimisations and IT security. The latter is seen in both a positive and a negative light: quantum computers could crack all of today’s security measures – but could also introduce new security standards that no quantum system can crack.

Need for research and explanation

The fundamental questions regarding this technology and its architecture are still the subject of extensive research. Analysts from Gartner, for example, are of the opinion that the technology has not yet got beyond the first of five innovation phases. It is therefore no wonder that quantum computing is still regarded as pretty exotic, both among the general public and within the IT world. This can be attributed to the fact that the way these systems work combines many new, or at least unfamiliar, basic concepts – but also to some misleading explanations in circulation.

Let’s start by describing the technology in a way that is relatively easy to understand. The simplest approach is to point out the differences to a conventional computer. The most important point is that quantum computers are analogue computers and have the associated problems with accuracy. The result of a quantum computation is read out by a physical measurement, and this measurement simultaneously collapses the system back to a basic state. The basic components are so-called qubits. The major difference between a qubit and a “normal” bit is that a qubit can assume not only the states zero and one but also a whole continuum of intermediate states. To get a rough initial idea, one can picture the state of a qubit as a point on the surface of a sphere (the so-called Bloch sphere), where certain points are assigned a logical value. For example, the top point on the z-axis (where x = 0 and y = 0) represents the binary value zero and the lowest z-point the value one. The state of a qubit is then a combination of the values along all three axes. This is called superposition. Some explanations state that a qubit “can be in all possible states at the same time”. That is misleading: at any given time the qubit has exactly one well-defined state – it is simply unknown as long as it is not measured. In any event, a quantum computer really can perform calculations on a qubit while it is in superposition. To do so, the amplitudes that make up the state are manipulated by logic gates before a result is finally read out by measurement. As soon as a qubit has been read, it collapses to either one or zero and loses all other information about its state.
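To make superposition, gates and measurement more tangible, here is a minimal sketch that simulates a single qubit classically with NumPy – an illustration of the concepts above, not code for a real quantum device:

```python
import numpy as np

# A qubit as a two-component state vector of complex amplitudes; |0> to start.
state = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of zero and one.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

# The measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2            # -> [0.5, 0.5]

# Measuring samples one outcome and collapses the state: all other
# information about the superposition is lost.
outcome = np.random.choice([0, 1], p=probs)
state = np.eye(2)[outcome].astype(complex)
print(f"measured {outcome}, probabilities were {probs}")
```

Note that the gate acts on the amplitudes deterministically; only the final measurement is probabilistic, which is exactly why the read-out destroys the superposition.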

That alone is already a remarkable difference to conventional binary systems. But there is more. The great significance of qubits rests on entanglement. Entanglement occurs when the individual qubits no longer hold independent single-qubit states but instead share one common state. That means a change to one qubit also automatically brings about a change to the qubit entangled with it. Their shared state is therefore a superposition of all the individual states. Compared with conventional computers, the entanglement of qubits corresponds to a controllable gate connection between every single bit in the system and every other bit. This makes programming quantum computers particularly complicated. The systems are controlled using sequences of different types of logic gates, and the programmes must be executed quickly enough that the qubits do not lose their state before it has been measured.
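The following sketch, again a plain NumPy simulation rather than real quantum hardware, shows what entanglement means in terms of state vectors: after a Hadamard and a CNOT gate, two qubits share one four-component state that can no longer be factored into two independent single-qubit states:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.array([1, 0, 0, 0], dtype=complex)  # both qubits in |0>
state = np.kron(H, I) @ state                  # Hadamard on the first qubit
state = CNOT @ state                           # entangle: (|00> + |11>) / sqrt(2)

print(np.abs(state) ** 2)  # ~[0.5, 0, 0, 0.5]: only 00 and 11 can occur,
                           # so measuring one qubit fixes the other
```

The measurement statistics make the point: the outcomes 01 and 10 never occur, so reading one qubit immediately determines the value of its entangled partner.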

From the laboratory out into the real world

Researchers have had these fundamental processes under control for some time now, but a stable, large-scale system also has to work outside the sterile environment of the laboratory. Interference from the environment is particularly problematic: as the processes are analogue and rely on magnetic field control, shielding and process stability are of extreme importance – otherwise, major errors can occur. Consequently, several factors feed into a system’s quantum volume. It is defined by the number of qubits, their connectivity, the coherence time (how long the qubits maintain their superposition), and the error rates of the gates and of the read-out. And this is the value that IBM succeeded in doubling from 8 to 16 in the Q System – meaning the figure has doubled year on year over the last few years.
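As a loose intuition for why all of these factors matter together, consider the following back-of-the-envelope sketch. It is not IBM’s actual benchmarking protocol; it merely assumes a uniform gate error rate and asks how large a “square” circuit (n qubits at depth n, so roughly n × n gates) can get before accumulated errors make the result unreliable, with the volume then being 2 to the power of that size:

```python
def rough_quantum_volume(gate_error: float, threshold: float = 2 / 3) -> int:
    """Toy estimate: find the largest n for which a circuit of roughly
    n * n gates still succeeds with probability above the threshold."""
    n = 1
    while (1 - gate_error) ** (n * n) >= threshold:
        n += 1
    return 2 ** (n - 1)

print(rough_quantum_volume(gate_error=0.01))  # lower error rates -> higher volume
```

The point of the metric is precisely this coupling: adding qubits without also lowering the error rates does not increase the quantum volume.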

Announcing this improvement in performance, Dario Gil, Director of the IBM Research Laboratory, was delighted to report: “We have now achieved the all-time lowest rate of measurement errors.”

As indicated at the beginning, this represents a major step towards the commercial use of quantum computing because, as they say, “size matters”. As Gartner puts it in a research note: “There are certainly interesting prototype applications, but the tasks that the system has been able to handle so far have been very small. The systems have to become much bigger to be able to handle more large-scale problems.” Then these tasks really will be of great interest. A great deal of experimentation is going on already, especially in the automotive industry. For example, Daimler is working with IBM’s cloud-based Q System – qubits-as-a-service, so to speak. “We want to use quantum computing to look into specific issues concerning future mobility and to help in the search for new materials,” says Ben Boeser, Director of Open Innovation at Daimler’s R&D laboratory in the United States. This mainly concerns battery technology, where it is hoped that the simulation of complex electrochemical processes will help to prepare the ground for lighter batteries with a higher capacity and shorter charging times. ExxonMobil is also collaborating with IBM on quantum computing. The oil giant wants to use the system to run seismic simulations so that geological structures can be mapped more accurately. Worldwide, a total of 40 Fortune 500 companies as well as numerous universities and research facilities are working with IBM’s quantum system.

Volkswagen is also involved in the application of quantum computing. In San Francisco, VW has assembled a 15-strong team of experts, led by Florian Neukart, to find out how quantum computing can benefit the company. The main focus is on optimisation problems and simulations. Such problems could also be solved with conventional technology, but Neukart is of the opinion that this would simply take too long. These optimisation issues are relevant not only to road traffic but also to production – for example, when the task at hand is to distribute a limited number of tools so that they deliver the greatest possible benefit.
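To illustrate what such a distribution problem looks like, here is a hypothetical toy version with invented benefit values: choose which of five production stations receive one of two available tools so that the total benefit is maximised. The brute-force search below is purely classical; it is this kind of combinatorial explosion that optimisation-oriented quantum approaches are meant to tame:

```python
from itertools import combinations

# Hypothetical benefit of placing a tool at each station (made-up numbers).
benefit = {"press": 9, "welding": 7, "paint": 4, "assembly": 8, "logistics": 5}
tools_available = 2  # the "limited number of tools"

# Exhaustively test every possible assignment and keep the best one.
best = max(combinations(benefit, tools_available),
           key=lambda stations: sum(benefit[s] for s in stations))
print(best)  # ('press', 'assembly')
```

With five stations and two tools there are only ten candidate assignments, but in a real plant the number of combinations grows so quickly that, as Neukart argues, conventional technology simply takes too long.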