
Computers prepare for the quantum leap



Quantum computing will radically transform information systems

After the upheaval brought about by artificial intelligence and ChatGPT-style chatbots in Internet search engines, computers are set to make the ultimate quantum leap: a drastic and vertiginous revolution in data processing and its applications. Remarkable progress has been made in recent months in the development of quantum computers. Although it will probably take several years, or even decades, for them to become mature and truly effective, “there will be more quantum innovation in the next five years than in the last thirty years,” says Jay Gambetta, vice president of quantum computing at IBM.

Quantum computing is increasingly in the spotlight. Recent advances in artificial intelligence have shown that major scientific and technological breakthroughs don’t always occur gradually and predictably, but sometimes several very significant advances take place in a short time with a spectacular result, as has happened in the past in different fields of science, technology or art. It remains to be seen whether progress in quantum computing will be dizzying in just a few years or whether the multiple challenges it faces will cause a slowdown in its commercial development.

Quantum computing is based on two probabilistic features of quantum physics: wave-particle duality and the principle of superposition of states. Quantum physics was developed just over a century ago by Einstein and other eminent physicists, with results that still defy logic and our understanding of reality, but everything indicates that its rules hold at the microscopic scale. Nonetheless, Newton’s classical physics of the 17th century remains perfectly adequate for explaining the everyday reality that surrounds us.

While a traditional computer processes data in elementary units of information (ones and zeros, or bits), quantum computers work with qubits, which can be a zero and a one at the same time, in accordance with the principle of quantum superposition. Because a qubit can exist in a combination of states, rather than in a fixed sequence of zeros and ones, a quantum computer can in effect explore many possibilities at once, rather than one at a time.
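The superposition of a single qubit can be sketched with a short classical simulation. This is purely illustrative (the function names are our own, and a real quantum computer is not simulated this way at scale): the qubit is a two-entry vector of complex amplitudes, a Hadamard gate turns the definite state |0⟩ into an equal superposition, and the Born rule gives the probability of measuring 0 or 1.

```python
import math

def apply_hadamard(state):
    """Apply a Hadamard gate to a single-qubit state [a0, a1]."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return [s * (a0 + a1), s * (a0 - a1)]

def measurement_probabilities(state):
    """Born rule: the probability of each outcome is |amplitude|^2."""
    return [abs(a) ** 2 for a in state]

qubit = [1.0, 0.0]             # the definite state |0>
qubit = apply_hadamard(qubit)  # now an equal superposition of |0> and |1>
print(measurement_probabilities(qubit))  # both outcomes close to 0.5
```

Running the sketch shows that after the Hadamard gate, a measurement yields 0 or 1 with roughly 50% probability each, which is the sense in which a qubit "is" both values at once until it is measured.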

A major breakthrough came in 2019, when Google announced the development of Sycamore, a 53-qubit quantum chip, which runs on superconductors at temperatures very close to absolute zero (0.015 kelvin, or minus 273.135 degrees Celsius). Last November, IBM revealed its new 433-qubit Osprey chip; this year’s plan foresees a 1,121-qubit chip, and by 2025 IBM plans to exceed 4,000 qubits. Similar developments are taking place in other countries, especially in China. However, a higher number of qubits doesn’t necessarily mean that the device can perform truly useful calculations, because the results depend on the quality of those qubits and their connectivity.

Software that corrects the multiple errors that occur is just as important as, if not more important than, the quantum chips themselves.

According to some experts, Google’s Sycamore is a well-balanced chip, because it has a relatively small number of high-quality qubits that are well interconnected. Quality qubits are those that don’t drown useful results in noise and errors, and so allow valid calculations to be performed. The main problem with today’s quantum computers (there are around 60 in operation worldwide) is that they generate a great deal of information that is of little use, because there’s no reliable way to separate the wheat from the chaff or to trust the results they present.

Errors are inevitable, and even more so in a quantum computer, whose information isn’t binary and can exist in multiple states. Therefore, it’s essential to minimise and correct these errors with the help of specific software during the information processing phase, as theoretical physicist Zaira Nazario, who specialises in quantum computing at IBM’s Watson research centre, pointed out in an article in Scientific American last May.

Thus, having a chip with many well-connected qubits, cooled to near absolute zero and producing information, isn’t enough. It must be coupled with sufficiently powerful software that can detect and discard errors in what has been processed. For Nazario, developing software capable of correcting the inevitable errors of quantum computers is even more important than having more powerful chips. This gives an idea of the colossal challenges facing the development of quantum computers.
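Real quantum error correction (surface codes and the like) is far more involved than anything shown here, but the underlying idea of protecting information with redundancy has a simple classical analogue: store each bit three times and recover it by majority vote, so that one flipped copy doesn’t corrupt the result. The sketch below is that classical analogy only, with illustrative names of our own.

```python
from collections import Counter

def encode(bit):
    """Repetition code: keep three redundant copies of each bit."""
    return [bit] * 3

def correct(copies):
    """Majority vote recovers the bit if at most one copy was flipped."""
    return Counter(copies).most_common(1)[0][0]

codeword = encode(1)      # [1, 1, 1]
codeword[0] ^= 1          # noise flips one copy -> [0, 1, 1]
print(correct(codeword))  # prints 1: the corrupted copy is outvoted
```

Quantum codes must do this without ever reading the protected qubits directly (measurement would destroy the superposition), which is part of why the error-correcting software is such a colossal challenge.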

The enormous computing power expected of quantum computers promises spectacular advances in many fields, such as weather and seismic disaster prediction models, the search for new medical treatments and resources, or real-time simulation of digital devices, because for certain classes of problems they would be vastly faster than current supercomputers.

However, if these advances do occur and the results are reliable, they could also have undesirable effects if such powerful computing tools fall into the wrong hands. National security agencies are already becoming concerned that current cryptographic systems may become obsolete. Many experts advise designing cryptographic systems now that will keep information fully secure and proof against the quantum computers of tomorrow. Some even consider it too late, in view of the great advances being made in quantum computing and the fact that more and more systems once thought to be absolutely secure are being breached, even with today’s supercomputers.
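To give a rough sense of the threat, a commonly cited rule of thumb (an assumption for illustration, not a statement about any specific system) is that Grover’s quantum search algorithm would roughly halve the effective key length of symmetric ciphers, while Shor’s algorithm would break widely used public-key schemes such as RSA outright. The symmetric case is simple enough to put in a back-of-the-envelope calculation:

```python
def grover_effective_bits(key_bits):
    """Grover's algorithm gives a quadratic speed-up over brute-force
    key search, roughly halving a symmetric key's effective security."""
    return key_bits // 2

for key_bits in (128, 256):
    print(f"A {key_bits}-bit key offers about "
          f"{grover_effective_bits(key_bits)} bits of security "
          f"against a quantum brute-force attack")
```

This is why post-quantum guidance tends to favour longer symmetric keys, while public-key cryptography needs to be replaced with new quantum-resistant algorithms rather than merely enlarged.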

The enormous computing power that quantum computers are expected to have may put at risk the confidentiality of data in operators’ telecommunications networks and data centres. To minimise this risk, the GSMA, the association that brings together most of the world’s telecommunications operators, announced last September the formation of the Post-Quantum Telco Network Taskforce, to “help define policy, regulation and operator business processes for the enhanced protection of telecommunications in a future of advanced quantum computing”.

“Given the accelerating advance of quantum computing, data and systems protected by current cryptographic methods could be obsolete within a few years,” said Scott Crowner, vice president of business development and adoption of quantum computing at IBM. For this reason, Crowner welcomes the GSMA’s initiative to create a task force to minimise the privacy risks of quantum computing. IBM and Vodafone are part of this initiative and other operators and developers of quantum computing hardware and software are expected to join in the near future.

Several countries have been engaged in the development of quantum computers for some time now, especially China, which, according to the World Economic Forum, last year invested about half of the estimated $30 billion worldwide total. Another quarter, some $7.5 billion, was invested in Europe. This makes the amount invested by the United States smaller than is generally believed, at least at the state level. A recent article in Time magazine said that the U.S. National Quantum Initiative invested $1.2 billion last year, compared to the $1 trillion the U.S. spends annually on defence.

It’s not clear whether private investments in quantum computing are included in these country figures. But we know that US companies Google, Amazon and IBM, and China’s Alibaba, among others, are investing considerable sums of their own in the development of quantum computing, though not necessarily for national security purposes. The quantum computing industry is certainly on the rise: consultancy firm IDC projects an industry turnover of $8.6 billion in 2027, up from $412 million in 2020.

As we mentioned, a true quantum revolution requires advances in both hardware and software. Enrique Lizaso is the CEO of Multiverse Computing, a San Sebastian-based company specialising in software for quantum computers that will take part in the MWC in Barcelona at the Spanish pavilion. He says that the learning algorithms of quantum computers are completely different from those used in traditional computer data centres. “There is a lot of work to be done to achieve reliable reporting of the results presented by quantum computers,” he says.