Building a quantum computer that can run reliable calculations is extremely difficult

Domenico Vicinanza, Associate Professor of Intelligent Systems and Data Science at Anglia Ruskin University, explains the difference between classical and quantum computing. The latter uses ‘qubits’ instead of bits: rather than being restricted to the binary state of 0 or 1, a qubit can be in either state or a superposition of both simultaneously.
Vicinanza gives the example of optimising flight paths for the 45,000+ flights, organised by 500+ airlines, using 4,000+ airports. With classical computing, this optimisation would be attempted sequentially using algorithms, which would take too long. With quantum computing, every permutation can be tried at the same time.
Quantum computing deals in probabilities rather than certainties, so classical computing isn’t going away anytime soon. In fact, reading this article reminded me of using LLMs. They’re very useful, but you have to know how to use them — and you can’t necessarily take a single response at face value.
Quantum computers are incredibly powerful for solving specific problems – such as simulating the interactions between different molecules, finding the best solution from many options or dealing with encryption and decryption. However, they are not suited to every type of task.
Classical computers process one calculation at a time in a linear sequence, and they follow algorithms (sets of mathematical rules for carrying out particular computing tasks) designed for use with classical bits that are either 0 or 1. This makes them extremely predictable, robust and less prone to errors than quantum machines. For everyday computing needs such as word processing or browsing the internet, classical computers will continue to play a dominant role.
There are at least two reasons for that. The first one is practical. Building a quantum computer that can run reliable calculations is extremely difficult. The quantum world is incredibly volatile, and qubits are easily disturbed by things in their environment, such as interference from electromagnetic radiation, which makes them prone to errors.
The second reason lies in the inherent uncertainty of dealing with qubits. Because qubits are in superposition (neither definitely 0 nor definitely 1), they are not as predictable as the bits used in classical computing. Physicists therefore describe qubits and their calculations in terms of probabilities. This means that the same problem, using the same quantum algorithm, run multiple times on the same quantum computer, might return a different solution each time.
To address this uncertainty, quantum algorithms are typically run multiple times. The results are then analysed statistically to determine the most likely solution. This approach allows researchers to extract meaningful information from the inherently probabilistic quantum computations.
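The repeated-run strategy can be illustrated with a small classical simulation. This is only a sketch, not real quantum hardware: it assumes a single qubit whose measurement yields 1 with some fixed probability, mimics many "shots" of the same computation, and reports the statistically most frequent outcome — the function names and the 70% bias are invented for illustration.

```python
# Minimal sketch (a classical simulation, not a real quantum computation):
# individual measurements of a qubit in superposition are probabilistic,
# but repeating the run many times and analysing the counts reveals the
# most likely answer, as described above.
import random
from collections import Counter

def measure(prob_one: float) -> int:
    """Simulate one measurement of a qubit that collapses to 1 with probability prob_one."""
    return 1 if random.random() < prob_one else 0

def most_likely_outcome(prob_one: float, shots: int = 1000) -> int:
    """Run the same 'computation' many times (shots) and return the dominant result."""
    counts = Counter(measure(prob_one) for _ in range(shots))
    return counts.most_common(1)[0][0]

# A biased qubit (70% chance of measuring 1): any single run may return
# 0 or 1, but the statistics over 1,000 shots point to the likely answer.
print(most_likely_outcome(0.7, shots=1000))
```

A single call to `measure` is unpredictable, which mirrors why one run of a quantum algorithm cannot be taken at face value; only the distribution over many shots carries the answer.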
Source: The Conversation
Image: Sigmund