Spiking neural network
Computer science
Artificial neural network
Neuromorphic engineering
Recurrent neural network
Artificial intelligence
Machine learning
Domain (mathematical analysis)
Pattern recognition (psychology)
Mathematics
Mathematical analysis
Authors
Bojian Yin,Federico Corradi,Sander M. Bohté
Identifiers
DOI:10.1038/s42256-021-00397-w
Abstract
Inspired by detailed modelling of biological neurons, spiking neural networks (SNNs) are investigated as biologically plausible and high-performance models of neural computation. The sparse and binary communication between spiking neurons potentially enables powerful and energy-efficient neural networks. The performance of SNNs, however, has remained lacking compared with artificial neural networks. Here we demonstrate how an activity-regularizing surrogate gradient combined with recurrent networks of tunable and adaptive spiking neurons yields the state of the art for SNNs on challenging benchmarks in the time domain, such as speech and gesture recognition. This also exceeds the performance of standard classical recurrent neural networks and approaches that of the best modern artificial neural networks. As these SNNs exhibit sparse spiking, we show that they are theoretically one to three orders of magnitude more computationally efficient than recurrent neural networks with similar performance. Together, this positions SNNs as an attractive solution for AI hardware implementations.

The use of sparse signals in spiking neural networks, modelled on biological neurons, offers in principle a highly efficient approach for artificial neural networks when implemented on neuromorphic hardware, but new training approaches are needed to improve performance. Using a new type of activity-regularizing surrogate gradient for backpropagation combined with recurrent networks of tunable and adaptive spiking neurons, state-of-the-art performance for spiking neural networks is demonstrated on benchmarks in the time domain.
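To illustrate the two ingredients the abstract names, the following is a minimal sketch of an adaptive leaky integrate-and-fire (LIF) neuron step together with a surrogate gradient for the non-differentiable spike. The specific update rules, time constants, and the fast-sigmoid surrogate shape are illustrative assumptions, not the paper's exact formulation.

```python
import math

def surrogate_grad(v, theta, beta=10.0):
    # Fast-sigmoid-style surrogate derivative of the spike nonlinearity,
    # used in backpropagation in place of the true (zero-almost-everywhere)
    # derivative of the Heaviside step. Peaks at 1.0 when v == theta.
    # The choice of surrogate and the sharpness beta are assumptions.
    return 1.0 / (beta * abs(v - theta) + 1.0) ** 2

def adaptive_lif_step(v, b, x, tau_m=20.0, tau_b=200.0,
                      theta0=1.0, beta_b=1.8, dt=1.0):
    # One Euler step of an adaptive LIF neuron (illustrative parameters):
    # the firing threshold theta rises with the adaptation variable b,
    # so a recently active neuron becomes harder to fire -> sparse spiking.
    theta = theta0 + beta_b * b
    v = v + (dt / tau_m) * (x - v)        # leaky membrane integration
    spike = 1.0 if v >= theta else 0.0    # binary spike emission
    v = v - spike * theta                 # soft reset after a spike
    b = b + (dt / tau_b) * (spike - b)    # adaptation: bumps on spikes, decays
    return v, b, spike, theta

# Drive the neuron with a constant supra-threshold input and count spikes.
v, b = 0.0, 0.0
spikes = 0
for _ in range(200):
    v, b, s, theta = adaptive_lif_step(v, b, x=1.5)
    spikes += int(s)
print(spikes)  # far fewer spikes than time steps: sparse activity
```

In a full network, the forward pass uses the hard spike while the backward pass substitutes `surrogate_grad`, which is what makes end-to-end backpropagation through spiking layers possible.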