Mode (computer interface)
Spiking neural network
Computer science
Artificial neural network
Neuroscience
Artificial intelligence
Psychology
Human-computer interaction
Authors
Zhanghan Lin, Haiping Huang
Source
Journal: Physical Review E
Date: 2024-08-13
Volume/Issue: 110 (2)
Identifier
DOI:10.1103/physreve.110.024306
Abstract
Spiking neural networks play an important role in brain-like neuromorphic computation and in studying the working mechanisms of neural circuits. One drawback of training a large-scale spiking neural network is that updating all weights is quite expensive. Furthermore, after training, all information related to the computational task is hidden in the weight matrix, prohibiting a transparent understanding of the circuit mechanism. Therefore, in this work, we address these challenges by proposing a spiking mode-based training protocol, in which the recurrent weight matrix is explained as a Hopfield-like product of three matrices: input modes, output modes, and a score matrix. The first advantage is that the weight matrix is interpreted through input and output modes and their associated scores, which characterize the importance of each decomposition term. The number of modes is thus adjustable, allowing more degrees of freedom for modeling experimental data. Because the space complexity of learning is greatly reduced, the training cost drops significantly; training of spiking networks is carried out in the mode-score space. The second advantage is that one can project the high-dimensional neural activity (filtered spike trains) in the state space onto the mode space, which is typically of low dimension, e.g., a few modes are sufficient to capture the shape of the underlying neural manifolds. We successfully apply our framework to two computational tasks: digit classification and selective sensory integration. Our method thus accelerates the training of spiking neural networks through a Hopfield-like decomposition, and moreover this training leads to low-dimensional attractor structures of high-dimensional neural dynamics.
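The decomposition described in the abstract can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the authors' implementation: the network size `N`, the number of modes `P`, the Gaussian initialization, and all variable names are assumptions. It shows the Hopfield-like three-matrix factorization of the recurrent weights, the parameter-count saving over training the full weight matrix, and the projection of a high-dimensional activity vector onto the low-dimensional mode space.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200  # number of neurons (assumed)
P = 5    # number of modes, adjustable and typically P << N (assumed)

# Input modes (N x P), output modes (N x P), and a score matrix (P x P).
# Only these factors would be trained, not the full N x N weight matrix.
xi_in = rng.standard_normal((N, P)) / np.sqrt(N)
xi_out = rng.standard_normal((N, P)) / np.sqrt(N)
scores = rng.standard_normal((P, P))

# Hopfield-like product of the three matrices gives the recurrent weights.
W = xi_in @ scores @ xi_out.T  # shape (N, N)

# Space complexity: 2*N*P + P^2 learnable parameters instead of N^2.
n_params = xi_in.size + xi_out.size + scores.size
assert n_params < N * N

# Project high-dimensional activity (a stand-in for a filtered spike
# train) onto the P-dimensional mode space.
r = rng.standard_normal(N)   # activity vector in state space
r_modes = xi_out.T @ r       # low-dimensional mode-space coordinates

print(W.shape, r_modes.shape, n_params)
```

With these assumed sizes the factorization uses 2,025 parameters instead of 40,000, which is the source of the reduced training cost the abstract refers to; the `r_modes` projection is the low-dimensional view in which attractor structure can be examined.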