Will quantum computing change deep learning? With leading tech giants like IBM and Google racing to make it a reality, researchers are working hard to repurpose QC to accelerate deep learning algorithms such as RNNs and CNNs. In fact, QC can be applied to all three of these tasks:

Simulation

Optimisation

Sampling

As one researcher famously put it, quantum computing will disrupt our path to AI and deep learning. There is a lot of ongoing research in Quantum AI, with tech giants and companies like Rigetti introducing full-stack QC. It has almost become a primary research area, with NASA and Google setting up the Quantum Artificial Intelligence Lab as early as 2013.

Let’s See How QC + AI Can Be A Reality

According to QC Ware, standard feedforward, convolutional, and recurrent neural networks have seen widespread adoption across many areas of ML over the last five years. However, a key obstacle lies in understanding how quantum computers can accelerate the training or inference stages of neural networks, or be used to improve their accuracy. D-Wave Federal, the US subsidiary that provides D-Wave's QC systems to the US government, has discussed several key questions on how QC can be used to advance deep learning and AI:

Why is QC required for AI and how does it overcome the performance deficiencies of classical computing resources?

Which AI and ML problems can be addressed by QC?

How will QC be built into AI or ML computing workflows?

How Can Quantum Computing Speed Up AI?

Theoretical computer scientist Scott Aaronson questions this claim in his paper titled Quantum Machine Learning Algorithms: Read the Fine Print. “The new algorithms provide a general template, showing how quantum computers might be used to provide exponential speedups for central problems like clustering, pattern-matching, and principal component analysis,” he writes. In other words, while quantum algorithms hold out the promise of exponential speedups over today’s computers for such problems, they are so far only templates, and each comes with caveats (the “fine print”) that must be satisfied before any speedup is actually realised.

However, there are a few roadblocks to this approach as well. For example, classical data works directly with classical algorithms, but to use quantum algorithms, classical data must first be encoded into quantum states. One technique suggested by a researcher is to use reinforcement learning, where a quantum agent interacts with a classical environment. Also, for QC to work on AI problems, quantum processors will have to be integrated with the current technology stack. For example, the ML frameworks used so far are TensorFlow, Caffe, Torch, Keras and Theano; in the future, we will see quantum hardware, along with APIs and languages, integrated into these systems. Canada’s D-Wave Systems largely deploys quantum annealing, which makes use of quantum fluctuations to find the global minimum of a function.

Challenges In Using QC For AI

Dr Amit Ray observes in his book that integrating QC with neural networks is an actively researched area. There are new developments in quantum learning algorithms as well. Cases in point are Quantum Convolutional Neural Networks (QCNN) and Quantum Reinforcement Neural Networks (QRNN), where the models are trained through gradient descent to generalise classical problems.
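To illustrate gradient-descent training of a parameterised quantum model (not the QCNN/QRNN algorithms themselves, which are not spelled out here), the sketch below classically simulates the simplest case: a single-qubit rotation whose ⟨Z⟩ expectation is cos(θ), minimised using the parameter-shift rule, a standard way of estimating gradients on quantum hardware. The learning rate and iteration count are illustrative assumptions.

```python
import math

# Toy "circuit": the <Z> expectation after rotating a qubit by angle theta
# is cos(theta). The parameter-shift rule evaluates the circuit at shifted
# angles to obtain an exact gradient: df/dtheta = (f(t + pi/2) - f(t - pi/2)) / 2.

def expectation(theta):
    return math.cos(theta)

def parameter_shift_grad(f, theta):
    return (f(theta + math.pi / 2) - f(theta - math.pi / 2)) / 2

theta, lr = 0.5, 0.3
for _ in range(100):
    theta -= lr * parameter_shift_grad(expectation, theta)

# theta converges toward pi, where <Z> = cos(pi) = -1 is minimal
```

The appeal of the parameter-shift rule is that it needs only two extra circuit evaluations per parameter, so the same gradient-descent loop used for classical networks can drive a quantum model.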

California-based Rigetti Computing uses one of its prototype quantum chips to run a clustering algorithm. Clustering is an ML technique used to organise data into similar groups. Xanadu.ai is conducting considerable research in QC and believes that ML and Quantum Machine Learning (QML) will become more pervasive and will redefine the way researchers think about QC. The company’s blog emphasises that quantum technologies will overhaul the AI hardware scene, and that hardware will in turn shape the future of quantum computing. Besides fueling current ML techniques, QC can also pave the way for new ML models.
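To make the clustering task concrete, here is a minimal classical k-means sketch, not Rigetti's quantum algorithm: it alternates between assigning points to their nearest centroid and moving each centroid to the mean of its points. The data, initialisation and parameters are all made up for illustration.

```python
import random

def kmeans(points, k, iters=20):
    """Minimal k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    # Simple deterministic init: seed centroids with evenly spaced points
    centroids = points[:: max(1, len(points) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: (p[0] - centroids[i][0]) ** 2
                                      + (p[1] - centroids[i][1]) ** 2)
            clusters[nearest].append(p)
        centroids = [(sum(p[0] for p in c) / len(c),
                      sum(p[1] for p in c) / len(c)) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

random.seed(1)
# Two well-separated synthetic blobs around (0, 0) and (10, 10)
data = ([(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(100)]
        + [(random.gauss(10, 1), random.gauss(10, 1)) for _ in range(100)])
centers = kmeans(data, k=2)
```

The assignment step is the expensive part at scale, since every point is compared against every centroid; distance computations like this are one of the subroutines quantum clustering proposals aim to speed up.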
