1. Goertzel, B. Artificial general intelligence: concept, state of the art, and future prospects. J. Artif. Gen. Intell. 5, 1–48 (2014).

2. Benjamin, B. V. et al. Neurogrid: a mixed-analog-digital multichip system for large-scale neural simulations. Proc. IEEE 102, 699–716 (2014).

3. Merolla, P. A. et al. A million spiking-neuron integrated circuit with a scalable communication network and interface. Science 345, 668–673 (2014).

4. Furber, S. B. et al. The SpiNNaker project. Proc. IEEE 102, 652–665 (2014).

5. Schemmel, J. et al. A wafer-scale neuromorphic hardware system for large-scale neural modeling. In Proc. 2010 IEEE Int. Symposium on Circuits and Systems 1947–1950 (IEEE, 2010).

6. Davies, M. et al. Loihi: a neuromorphic manycore processor with on-chip learning. IEEE Micro 38, 82–99 (2018).

7. Chen, Y.-H. et al. Eyeriss: an energy-efficient reconfigurable accelerator for deep convolutional neural networks. IEEE J. Solid-State Circuits 52, 127–138 (2017).

8. Jouppi, N. P. et al. In-datacenter performance analysis of a tensor processing unit. In 2017 ACM/IEEE 44th Annual Int. Symposium on Computer Architecture 1–12 (IEEE, 2017).

9. Markram, H. The blue brain project. Nat. Rev. Neurosci. 7, 153–160 (2006).

10. Izhikevich, E. M. Simple model of spiking neurons. IEEE Trans. Neural Netw. 14, 1569–1572 (2003).

11. Eliasmith, C. et al. A large-scale model of the functioning brain. Science 338, 1202–1205 (2012).

12. Song, S., Miller, K. D. & Abbott, L. F. Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nat. Neurosci. 3, 919–926 (2000).

13. Gusfield, D. Algorithms on Strings, Trees and Sequences: Computer Science and Computational Biology (Cambridge Univ. Press, 1997).

14. Qiu, G. Modelling the visual cortex using artificial neural networks for visual image reconstruction. In Fourth Int. Conference on Artificial Neural Networks 127–132 (Institution of Engineering and Technology, 1995).

15. LeCun, Y., Bengio, Y. & Hinton, G. Deep learning. Nature 521, 436–444 (2015).

16. Russell, S. J. & Norvig, P. Artificial Intelligence: A Modern Approach (Pearson Education, 2016).

17. He, K. et al. Deep residual learning for image recognition. In Proc. IEEE Conference on Computer Vision and Pattern Recognition 770–778 (IEEE, 2016).

18. Hinton, G. et al. Deep neural networks for acoustic modeling in speech recognition. IEEE Signal Process. Mag. 29, 82–97 (2012).

19. Young, T. et al. Recent trends in deep learning based natural language processing. IEEE Comput. Intell. Mag. 13, 55–75 (2018).

20. Silver, D. et al. Mastering the game of Go with deep neural networks and tree search. Nature 529, 484–489 (2016).

21. Lake, B. M. et al. Building machines that learn and think like people. Behav. Brain Sci. 40, e253 (2017).

22. Hassabis, D. et al. Neuroscience-inspired artificial intelligence. Neuron 95, 245–258 (2017).

23. Marblestone, A. H., Wayne, G. & Kording, K. P. Toward an integration of deep learning and neuroscience. Front. Comput. Neurosci. 10, 94 (2016).

24. Lillicrap, T. P. et al. Random synaptic feedback weights support error backpropagation for deep learning. Nat. Commun. 7, 13276 (2016).

25. Roelfsema, P. R. & Holtmaat, A. Control of synaptic plasticity in deep cortical networks. Nat. Rev. Neurosci. 19, 166–180 (2018).

26. Ullman, S. Using neuroscience to develop artificial intelligence. Science 363, 692–693 (2019).

27. Xu, K. et al. Show, attend and tell: neural image caption generation with visual attention. In Int. Conference on Machine Learning (eds Bach, F. & Blei, D.) 2048–2057 (International Machine Learning Society, 2015).

28. Zhang, B., Shi, L. & Song, S. in Brain-Inspired Robotics: The Intersection of Robotics and Neuroscience (eds Sanders, S. & Oberst, J.) 4–9 (Science/AAAS, 2016).

29. Sabour, S., Frosst, N. & Hinton, G. E. Dynamic routing between capsules. Adv. Neural Inf. Process. Syst. 30, 3856–3866 (2017).

30. Mi, Y. et al. Spike frequency adaptation implements anticipative tracking in continuous attractor neural networks. Adv. Neural Inf. Process. Syst. 27, 505–513 (2014).

31. Herrmann, M., Hertz, J. & Prügel-Bennett, A. Analysis of synfire chains. Network 6, 403–414 (1995).

32. London, M. & Häusser, M. Dendritic computation. Annu. Rev. Neurosci. 28, 503–532 (2005).

33. Imam, N. & Manohar, R. Address-event communication using token-ring mutual exclusion. In 2011 17th IEEE Int. Symposium on Asynchronous Circuits and Systems 99–108 (IEEE, 2011).

34. Deng, L. et al. GXNOR-Net: training deep neural networks with ternary weights and activations without full-precision memory under a unified discretization framework. Neural Netw. 100, 49–58 (2018).

35. Han, S. et al. EIE: efficient inference engine on compressed deep neural network. In 2016 ACM/IEEE 43rd Annual Int. Symposium on Computer Architecture 243–254 (IEEE, 2016).

36. Diehl, P. U. et al. Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing. In 2015 Int. Joint Conference on Neural Networks 1–8 (IEEE, 2015).

37. Wu, Y. et al. Spatio-temporal backpropagation for training high-performance spiking neural networks. Front. Neurosci. 12, 331 (2018).

38. Orchard, G. et al. Converting static image datasets to spiking neuromorphic datasets using saccades. Front. Neurosci. 9, 437 (2015).

39. Krizhevsky, A., Sutskever, I. & Hinton, G. E. ImageNet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 25, 1097–1105 (2012).

40. Simonyan, K. & Zisserman, A. Very deep convolutional networks for large-scale image recognition. In Int. Conference on Learning Representations; preprint at https://arxiv.org/pdf/1409.1556.pdf (2015).

41. Deng, J. et al. ImageNet: a large-scale hierarchical image database. In 2009 IEEE Conference on Computer Vision and Pattern Recognition 248–255 (IEEE, 2009).

42. LeCun, Y. et al. Gradient-based learning applied to document recognition. Proc. IEEE 86, 2278–2324 (1998).

43. Courbariaux, M., Bengio, Y. & David, J.-P. BinaryConnect: training deep neural networks with binary weights during propagations. Adv. Neural Inf. Process. Syst. 28, 3123–3131 (2015).

44. Krizhevsky, A. & Hinton, G. Learning Multiple Layers of Features from Tiny Images. MSc thesis, Univ. Toronto (2009).

45. Merity, S. et al. Pointer sentinel mixture models. In Int. Conference on Learning Representations; preprint at https://arxiv.org/abs/1609.07843 (2017).

46. Krakovna, V. & Doshi-Velez, F. Increasing the interpretability of recurrent neural networks using hidden Markov models. Preprint at https://arxiv.org/abs/1606.05320 (2016).

47. Wu, S. et al. Training and inference with integers in deep neural networks. In Int. Conference on Learning Representations; preprint at https://arxiv.org/abs/1802.04680 (2018).

48. Paszke, A. et al. Automatic differentiation in PyTorch. In Proc. NIPS Autodiff Workshop https://openreview.net/pdf?id=BJJsrmfCZ (2017).

50. Fowers, J. et al. A configurable cloud-scale DNN processor for real-time AI. In 2018 ACM/IEEE 45th Annual Int. Symposium on Computer Architecture 1–14 (IEEE, 2018).

51. Xu, M. et al. HMM-based audio keyword generation. In Advances in Multimedia Information Processing – PCM 2004, Vol. 3333 (eds Aizawa, K. et al.) 566–574 (Springer, 2004).

52. Mathis, A., Herz, A. V. & Stemmler, M. B. Resolution of nested neuronal representations can be exponential in the number of neurons. Phys. Rev. Lett. 109, 018103 (2012).

53. Gerstner, W. et al. Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition (Cambridge Univ. Press, 2014).

54. Liang, D. & Indiveri, G. Robust state-dependent computation in neuromorphic electronic systems. In 2017 IEEE Biomedical Circuits and Systems Conference 1–4 (IEEE, 2017).

55. Akopyan, F. et al. TrueNorth: design and tool flow of a 65 mW 1 million neuron programmable neurosynaptic chip. IEEE Trans. Comput. Aided Des. Integr. Circuits Syst. 34, 1537–1557 (2015).