Towards artificial general intelligence with hybrid Tianjic chip architecture

1. Goertzel, B. Artificial general intelligence: concept, state of the art, and future prospects. J. Artif. Gen. Intell. 5, 1–48 (2014).

2. Benjamin, B. V. et al. Neurogrid: a mixed-analog-digital multichip system for large-scale neural simulations. Proc. IEEE 102, 699–716 (2014).

3. Merolla, P. A. et al. A million spiking-neuron integrated circuit with a scalable communication network and interface. Science 345, 668–673 (2014).

4. Furber, S. B. et al. The SpiNNaker project. Proc. IEEE 102, 652–665 (2014).

5. Schemmel, J. et al. A wafer-scale neuromorphic hardware system for large-scale neural modeling. In Proc. 2010 IEEE Int. Symposium on Circuits and Systems 1947–1950 (IEEE, 2010).

6. Davies, M. et al. Loihi: a neuromorphic manycore processor with on-chip learning. IEEE Micro 38, 82–99 (2018).

7. Chen, Y.-H. et al. Eyeriss: an energy-efficient reconfigurable accelerator for deep convolutional neural networks. IEEE J. Solid-State Circuits 52, 127–138 (2017).

8. Jouppi, N. P. et al. In-datacenter performance analysis of a tensor processing unit. In 2017 ACM/IEEE 44th Annual Int. Symposium on Computer Architecture 1–12 (IEEE, 2017).

9. Markram, H. The blue brain project. Nat. Rev. Neurosci. 7, 153–160 (2006).

10. Izhikevich, E. M. Simple model of spiking neurons. IEEE Trans. Neural Netw. 14, 1569–1572 (2003).

11. Eliasmith, C. et al. A large-scale model of the functioning brain. Science 338, 1202–1205 (2012).

12. Song, S., Miller, K. D. & Abbott, L. F. Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nat. Neurosci. 3, 919–926 (2000).

13. Gusfield, D. Algorithms on Strings, Trees and Sequences: Computer Science and Computational Biology (Cambridge Univ. Press, 1997).

14. Qiu, G. Modelling the visual cortex using artificial neural networks for visual image reconstruction. In Fourth Int. Conference on Artificial Neural Networks 127–132 (Institution of Engineering and Technology, 1995).

15. LeCun, Y., Bengio, Y. & Hinton, G. Deep learning. Nature 521, 436–444 (2015).

16. Russell, S. J. & Norvig, P. Artificial Intelligence: A Modern Approach (Pearson Education, 2016).

17. He, K. et al. Deep residual learning for image recognition. In Proc. IEEE Conference on Computer Vision and Pattern Recognition 770–778 (IEEE, 2016).

18. Hinton, G. et al. Deep neural networks for acoustic modeling in speech recognition. IEEE Signal Process. Mag. 29, 82–97 (2012).

19. Young, T. et al. Recent trends in deep learning based natural language processing. IEEE Comput. Intell. Mag. 13, 55–75 (2018).

20. Silver, D. et al. Mastering the game of Go with deep neural networks and tree search. Nature 529, 484–489 (2016).

21. Lake, B. M. et al. Building machines that learn and think like people. Behav. Brain Sci. 40, e253 (2017).

22. Hassabis, D. et al. Neuroscience-inspired artificial intelligence. Neuron 95, 245–258 (2017).

23. Marblestone, A. H., Wayne, G. & Kording, K. P. Toward an integration of deep learning and neuroscience. Front. Comput. Neurosci. 10, 94 (2016).

24. Lillicrap, T. P. et al. Random synaptic feedback weights support error backpropagation for deep learning. Nat. Commun. 7, 13276 (2016).

25. Roelfsema, P. R. & Holtmaat, A. Control of synaptic plasticity in deep cortical networks. Nat. Rev. Neurosci. 19, 166–180 (2018).

26. Ullman, S. Using neuroscience to develop artificial intelligence. Science 363, 692–693 (2019).

27. Xu, K. et al. Show, attend and tell: neural image caption generation with visual attention. In Int. Conference on Machine Learning (eds Bach, F. & Blei, D.) 2048–2057 (International Machine Learning Society, 2015).

28. Zhang, B., Shi, L. & Song, S. in Brain-Inspired Robotics: The Intersection of Robotics and Neuroscience (eds Sanders, S. & Oberst, J.) 4–9 (Science/AAAS, 2016).

29. Sabour, S., Frosst, N. & Hinton, G. E. Dynamic routing between capsules. Adv. Neural Inf. Process. Syst. 30, 3856–3866 (2017).

30. Mi, Y. et al. Spike frequency adaptation implements anticipative tracking in continuous attractor neural networks. Adv. Neural Inf. Process. Syst. 27, 505–513 (2014).

31. Herrmann, M., Hertz, J. & Prügel-Bennett, A. Analysis of synfire chains. Network 6, 403–414 (1995).

32. London, M. & Häusser, M. Dendritic computation. Annu. Rev. Neurosci. 28, 503–532 (2005).

33. Imam, N. & Manohar, R. Address-event communication using token-ring mutual exclusion. In 2011 17th IEEE Int. Symposium on Asynchronous Circuits and Systems 99–108 (IEEE, 2011).

34. Deng, L. et al. GXNOR-Net: training deep neural networks with ternary weights and activations without full-precision memory under a unified discretization framework. Neural Netw. 100, 49–58 (2018).

35. Han, S. et al. EIE: efficient inference engine on compressed deep neural network. In 2016 ACM/IEEE 43rd Annual Int. Symposium on Computer Architecture 243–254 (IEEE, 2016).

36. Diehl, P. U. et al. Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing. In 2015 Int. Joint Conference on Neural Networks 1–8 (IEEE, 2015).

37. Wu, Y. et al. Spatio-temporal backpropagation for training high-performance spiking neural networks. Front. Neurosci. 12, 331 (2018).

38. Orchard, G. et al. Converting static image datasets to spiking neuromorphic datasets using saccades. Front. Neurosci. 9, 437 (2015).

39. Krizhevsky, A., Sutskever, I. & Hinton, G. E. ImageNet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 25, 1097–1105 (2012).

40. Simonyan, K. & Zisserman, A. Very deep convolutional networks for large-scale image recognition. Int. Conference on Learning Representations; preprint at https://arxiv.org/pdf/1409.1556.pdf (2015).

41. Deng, J. et al. ImageNet: a large-scale hierarchical image database. In 2009 IEEE Conference on Computer Vision and Pattern Recognition 248–255 (IEEE, 2009).

42. LeCun, Y. et al. Gradient-based learning applied to document recognition. Proc. IEEE 86, 2278–2324 (1998).

43. Courbariaux, M., Bengio, Y. & David, J.-P. BinaryConnect: training deep neural networks with binary weights during propagations. Adv. Neural Inf. Process. Syst. 28, 3123–3131 (2015).

44. Krizhevsky, A. & Hinton, G. Learning multiple layers of features from tiny images. Master's thesis, Univ. Toronto (2009).

45. Merity, S. et al. Pointer sentinel mixture models. Int. Conference on Learning Representations; preprint at https://arxiv.org/abs/1609.07843 (2017).

46. Krakovna, V. & Doshi-Velez, F. Increasing the interpretability of recurrent neural networks using hidden Markov models. Preprint at https://arxiv.org/abs/1606.05320 (2016).

47. Wu, S. et al. Training and inference with integers in deep neural networks. Int. Conference on Learning Representations; preprint at https://arxiv.org/abs/1802.04680 (2018).

48. Paszke, A. et al. Automatic differentiation in PyTorch. In Proc. NIPS Autodiff Workshop https://openreview.net/pdf?id=BJJsrmfCZ (2017).

49. Narang, S. & Diamos, G. Baidu DeepBench. https://github.com/baidu-research/DeepBench (2017).

50. Fowers, J. et al. A configurable cloud-scale DNN processor for real-time AI. In 2018 ACM/IEEE 45th Annual Int. Symposium on Computer Architecture 1–14 (IEEE, 2018).

51. Xu, M. et al. HMM-based audio keyword generation. In Advances in Multimedia Information Processing – PCM 2004, Vol. 3333 (eds Aizawa, K. et al.) 566–574 (Springer, 2004).

52. Mathis, A., Herz, A. V. M. & Stemmler, M. B. Resolution of nested neuronal representations can be exponential in the number of neurons. Phys. Rev. Lett. 109, 018103 (2012).

53. Gerstner, W. et al. Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition (Cambridge Univ. Press, 2014).