This is what I've seen in quantum tech this month:

00 💥 Focus: Quantum Machine Learning

01 🗞 Tech News

10 📰 Research Highlights

11 🎲 Bonus Links

00 💥 Focus: Quantum Machine Learning

2017, the boom of quantum machine learning: although the first quantum algorithms for machine learning were proposed as far back as the late 90s, this research subfield is now attracting ever-increasing interest from the quantum tech research community. With several articles on the topic uploaded each week, there is an ongoing effort to grasp the reach of the new discipline, and multiple reviews have appeared recently, including one that even discusses the meaning of quantum-enhanced artificial intelligence. Link







The central idea underpinning quantum machine learning is simple: since quantum machines promise to carry out some computations in a completely unconventional way, potentially exponentially faster, it is natural to exploit them to broaden the toolbox of standard machine learning. The proposition comes with promising indications, such as quantum subroutines that can solve a special set of linear algebra problems in far fewer steps, as well as fascinating possibilities, like exploring the effect of artificially injected quantum noise in data analysis, which might play a role similar to the one noise plays in classical machine learning.

Still, there are a number of serious theoretical challenges to tackle, as we are just starting to scratch the surface, including two fundamental bottlenecks known as the input and the output problem: "classical" data needs to be "quantized" before it can be fed to a quantum computer, and the quantum output then needs to be retrieved. Loading a quantum random access memory (qRAM) appears to be an expensive task, and once such interconversions are taken into account, the promised advantages might be nullified by the overhead. A succinct but technical overview of the current state of the art has been published in Nature. Link

For these reasons, the most promising machine learning applications for the first generation of quantum machines might be the study and optimisation of problems that are intrinsically quantum, such as quantum chemistry and dynamics, where one works with quantum data from the start.

Much remains to be seen, given a critical difference between the state of affairs in quantum and classical machine learning: while engineers are obtaining impressive results simply by tinkering with neural networks, even if the machines' rationale is hard to interpret, we are still waiting for deployed quantum machines that harness the power of quantum algorithms. Meanwhile, standard machine learning techniques are being applied to optimise quantum experiments, and their paradigms are starting to flow into quantum information theory.
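To make the input problem concrete, here is a minimal numpy sketch (my own illustration, not code from any of the cited reviews) of amplitude encoding: packing a classical vector into the amplitudes of an n-qubit state, which is the step a qRAM is meant to perform efficiently.

```python
import numpy as np

def amplitude_encode(x):
    """Encode a classical vector into the amplitudes of a quantum state.

    The vector is zero-padded to the next power of two and normalised,
    since a valid n-qubit state holds 2**n amplitudes with unit norm.
    Returns the amplitude vector and the number of qubits needed.
    """
    x = np.asarray(x, dtype=float)
    n_qubits = int(np.ceil(np.log2(len(x)))) if len(x) > 1 else 1
    padded = np.zeros(2 ** n_qubits)
    padded[: len(x)] = x
    norm = np.linalg.norm(padded)
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    return padded / norm, n_qubits

# Two classical numbers fit in the two amplitudes of a single qubit.
state, n = amplitude_encode([3.0, 4.0])
print(n)       # 1
print(state)   # [0.6 0.8]
```

The reverse direction is the output problem: a measurement only samples from these amplitudes, so reading the full state back out requires many repetitions, which is part of the overhead discussed above.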



01 🗞 Tech News





Microsoft is officially investing in an experimental route to quantum computing, supporting Charles Marcus at the University of Copenhagen. The news had been known for some time and is now official. The company has focused much of its research on an unconventional theoretical approach, based on the beautiful mathematical properties of "non-Abelian anyons", quasi-particles that are expected to appear in two-dimensional systems. Experimental research along this route is currently at a nascent stage. Link

Satya Nadella set quantum computing on par with artificial intelligence and mixed reality as a major R&D line guiding the evolution of Microsoft, mentioning it both in his keynote speech at the Ignite conference and in his recent autobiography, 'Hit Refresh'. In a joint interview with Bill Gates, they both confessed to being baffled by quantum computing. Link



Videos from the VC-backed workshop held in Munich last June, moderated by The Economist's Jason Palmer, including a talk by IonQ CEO David Moehring. Link



10 📰 Research Highlights

The high-level steering committee for the EU quantum flagship released its final report, setting out its recommendations for the structure and strategy of the impending programme. Unlike previous EU flagships, the €1 bln technology-transfer project will not be based on a closed consortium model. The only area in which the experts expect a high level of technology readiness within three years is quantum communication. Link



Researchers led by Jian-Wei Pan have shared quantum-encrypted data over a commercial optical fibre stretching over 60 km of metropolitan and intercity network. Link



A recent preprint by researchers at Google, UC Santa Barbara, and NASA provides a blueprint for the quantum supremacy quest the team is pursuing. The leader of the Google team announced at a recent conference that their latest chip holds over 20 qubits, on track with their quest to reach about 50 qubits later next year. Link



The scalability of optimization in quantum annealers such as DWave is hampered by temperature. Link



Together with the quantum machine learning review, Nature published several articles on quantum tech in a special issue, including a comment on open source software and reviews on fault-tolerant quantum computing and on post-quantum cryptography. Link



Classical algorithms are better than expected at simulating boson sampling, already reproducing the results that would be obtained by intertwining the paths of 30 photons. Since the properties of special matrices can be inferred from such quantum interference experiments, this result raises the bar for future quantum advantage claims in this direction. Link
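For context on why simulating boson sampling classically is considered hard: the output probabilities are governed by permanents of submatrices of the interferometer's unitary, and computing the permanent is #P-hard in general. A small sketch (my own illustration, not the paper's code) of Ryser's formula, the standard exponential-time classical algorithm:

```python
from itertools import combinations

def permanent(a):
    """Matrix permanent via Ryser's inclusion-exclusion formula.

    Runs in O(2**n * n**2) time for an n x n matrix, versus n! for the
    naive sum over permutations; this cost is the classical bottleneck
    in boson-sampling simulation.
    """
    n = len(a)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in a:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

# The permanent of the all-ones 2x2 matrix is 1*1 + 1*1 = 2.
print(permanent([[1.0, 1.0], [1.0, 1.0]]))  # 2.0
```

Unlike the determinant, the alternating signs that make Gaussian elimination work are absent here, which is why no polynomial-time algorithm is known.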



A conference on quantum computing will be held in Spain in February 2018. It will be attended by representatives from almost all current commercial quantum computing stakeholders – Google, IonQ, DWave, Rigetti, IBM, Microsoft. Link



A single qubit encoded in an ion can hold its memory for over ten minutes. Link



A Bose-Einstein condensate will be sent to space. Links



11 🎲 Bonus Links





Scott Aaronson's long blog post: Big Numbers. Link

The sound of quantum. Link



John Preskill's blackboard lecture: Quantum tech as a basic research frontier. Link



Liked what you read? Please share it or subscribe here: eepurl.com/cSiixT