These tech giants are helping researchers advance quantum computing.
Quantum computing is not a new scientific term, but it never ceases to fascinate everyone, techies and non-techies alike. Quantum computers have the power to solve certain complex problems far faster than classical computers. For example, quantum computing is being used in the pharmaceutical industry to boost vaccine production.
Yet quantum computing has its own quirks. Even the fastest quantum computers today have only around 100 qubits, and those qubits are plagued by random errors. In 2019, Google claimed that its 54-qubit quantum computer had solved a problem in minutes that would take a classical computer about 10,000 years. While that sounds extraordinary, this 'quantum advantage' applies only in some very specific situations.
According to Peter Selinger, a mathematician and quantum-computing specialist at Dalhousie University in Halifax, Canada, quantum computers will need many thousands of qubits before their applications can widen. "The stage of quantum computers now is something like classical computing in the late 1980s," states Sara Metwalli, a researcher at Keio University in Tokyo, Japan. "Most of the work done now is to prove that quantum, in the future, may have the ability to solve interesting problems."
As advances continue, IBM aspires to build a 1,000-qubit machine by the end of 2023. So what stands in the way of the ultimate quantum computer? The digital logic that powers a classical computer does not carry over: quantum computers need logic that is much more fluid. According to Krysta Svore, principal manager of the quantum-computing group at Microsoft Research, "Quantum computing is essentially matrix-vector multiplication." To help people understand the workings of quantum computing, IBM has created an interactive toolkit alongside its Qiskit quantum language.
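Svore's "matrix-vector multiplication" remark can be made concrete with a few lines of NumPy. This is a minimal sketch (not IBM's toolkit): a qubit's state is a two-element complex vector, a gate is a small unitary matrix, and applying the gate is literally one matrix-vector product.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # |0>: the qubit is definitely "off"

# The Hadamard gate puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                         # "quantum logic" = matrix-vector multiplication
probabilities = np.abs(state) ** 2       # Born rule: probability = |amplitude|^2
print(probabilities)                     # → [0.5 0.5]
```

After the gate, measuring the qubit yields 0 or 1 with equal probability, which is exactly what the two 0.5 entries say.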
To expand the applications of quantum computing, scientists also need to understand quantum circuits, explains Jeannette Garcia of IBM Research. She adds that these circuits represent how qubits are transformed by logical gates, similar to the AND, OR, and NOT gates that make up electronic circuits.
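The gate analogy can be sketched directly. Below, assumed for illustration, are the quantum counterpart of NOT (the X gate) and a two-qubit controlled-NOT, written as the unitary matrices that act on state vectors:

```python
import numpy as np

X = np.array([[0, 1],
              [1, 0]], dtype=complex)    # quantum NOT: swaps |0> and |1>

CNOT = np.array([[1, 0, 0, 0],           # controlled-NOT on two qubits:
                 [0, 1, 0, 0],           # flips the target qubit only when
                 [0, 0, 0, 1],           # the control qubit is |1>
                 [0, 0, 1, 0]], dtype=complex)

ket0 = np.array([1, 0], dtype=complex)
ket1 = X @ ket0                          # NOT|0> = |1>

# Build the two-qubit state |10> with a tensor (Kronecker) product, then apply CNOT.
state = CNOT @ np.kron(ket1, ket0)       # control is 1, so the target flips: |11>
print(np.abs(state) ** 2)                # → [0. 0. 0. 1.]
```

Unlike classical AND and OR, every quantum gate must be reversible, which is why each one is a unitary matrix rather than an arbitrary truth table.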
The Quantum Language
Microsoft, IBM, and Google each have tools to help coders: Q#, Qiskit, and Cirq, respectively, all inspired by the Python programming language and paired with user-friendly development environments.
Microsoft also offers a quantum development kit (QDK), which includes code libraries, a debugger, and a resource estimator that can tell how many qubits an algorithm requires. It is not just the tech giants: Rigetti Computing in California has its own 31-qubit machine and has released a quantum software development kit known as Forest, while Cambridge Quantum Computing, based in the UK, has launched tket with its own library.
The most recent release is last year's Silq, from the Swiss Federal Institute of Technology, a language that supports "uncomputation": it can automatically reset the temporary values used by a quantum program.
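The idea behind uncomputation can be sketched with a classical stand-in for reversible logic (this is an illustration of the concept, not Silq code). A Toffoli (controlled-controlled-NOT) gate computes a AND b into a scratch "ancilla" bit; after the answer is copied out, applying the same Toffoli again returns the ancilla to zero:

```python
def toffoli(a, b, target):
    """Reversible AND: flip the target bit iff both control bits are 1."""
    return target ^ (a & b)

def and_with_uncompute(a, b):
    ancilla = 0
    ancilla = toffoli(a, b, ancilla)   # compute: ancilla = a AND b
    result = 0 ^ ancilla               # a CNOT copies the answer to an output bit
    ancilla = toffoli(a, b, ancilla)   # uncompute: same gate again resets ancilla to 0
    assert ancilla == 0                # temporary value cleanly erased
    return result

print([and_with_uncompute(a, b) for a in (0, 1) for b in (0, 1)])  # → [0, 0, 0, 1]
```

In a real quantum program this reset matters because a "dirty" ancilla stays entangled with the rest of the computation and can corrupt it; Silq's contribution is doing this bookkeeping automatically.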
Tech giants like IBM, Amazon, and Microsoft offer access to the hardware on certain terms. If a research organization wants to use IBM’s machines, it needs to become a part of its Quantum Network, which includes universities, laboratories, and companies. Microsoft also offers access to quantum computers via its new Azure Quantum platform. Of course, research institutions have to apply to become a member. Amazon allows researchers to use other firms’ quantum devices through Amazon Web Services, a cloud-computing platform. Experiencing quantum computing on classical machines is possible via emulators. Microsoft’s QDK has a built-in emulator that can simulate a 30-qubit machine on a laptop.
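What an emulator does under the hood can be sketched in a few lines: it stores the full quantum state as a vector of 2^n complex amplitudes in classical memory. That exponential growth is why laptop emulation tops out around 30 qubits; 2**30 amplitudes at 16 bytes each is roughly 16 GB of RAM. Here is a minimal two-qubit example (assumed for illustration, not Microsoft's QDK):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
I = np.eye(2, dtype=complex)                                  # identity (do nothing)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.zeros(4, dtype=complex)
state[0] = 1                              # both qubits start in |0>: state |00>
state = np.kron(H, I) @ state             # Hadamard on the first qubit only
state = CNOT @ state                      # entangle the pair: a Bell state
print(np.round(np.abs(state) ** 2, 3))    # → [0.5 0.  0.  0.5]
```

Measuring this pair gives 00 or 11 with equal probability and never 01 or 10, the hallmark of entanglement, all reproduced with ordinary matrix arithmetic on a classical machine.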
As the field develops, quantum computers will keep advancing, but they will not replace classical machines. Instead, they are likely to come embedded within a larger classical architecture, which will hand them those complex problems to solve in a jiffy.