Keywords
MNIST database, computer science, artificial neural network, crossbar switch, inference, spiking neural network, artificial intelligence, deep learning, electronic engineering, topology (electrical circuits), electrical engineering, engineering, telecommunications
Authors
J. Minguet Lopez, Quentin Rafhay, Manon Dampfhoffer, Lucas Reganaz, N. Castellani, V. Meli, Simon J. Martin, L. Grenouillet, G. Navarro, T. Magis, C. Carabasse, Tifenn Hirtzlin, Elisa Vianello, Damien Deleruyelle, Jean‐Michel Portal, G. Molas, F. Andrieu
Identifier
DOI:10.1002/aelm.202200323
Abstract
Single-memristor crossbar arrays are a very promising approach to reducing the power consumption of deep learning accelerators. In parallel, the emerging bio‐inspired spiking neural networks (SNNs) offer very low power consumption with satisfactory performance on complex artificial intelligence tasks. In such neural networks, synaptic weights can be stored in nonvolatile memories. The latter are massively read during inference, which can lead to device failure. In this context, a 1S1R (1 Selector 1 Resistor) device, composed of a HfO2‐based OxRAM memory stacked on a Ge‐Se‐Sb‐N‐based ovonic threshold switch (OTS) back‐end selector, is proposed for high‐density hardware implementation of binarized SNN (BSNN) synaptic weights. An extensive experimental statistical study combined with a novel Monte Carlo model enables a deep analysis of the OTS switching dynamics, based on field‐driven stochastic nucleation of conductive dots in the layer. This makes it possible to quantify the occurrence frequency of erratic OTS switching as a function of the applied voltages and the 1S1R reading frequency, and the associated 1S1R reading error rate is calculated. Focusing on the standard MNIST machine-learning image recognition task, BSNN figures of merit (footprint, electrical consumption during inference, frequency of inference, accuracy, and tolerance to errors) are optimized by engineering the network topology, training procedure, and activation sparsity.
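The abstract's Monte Carlo picture (field-driven stochastic nucleation of conductive dots, with the OTS switching once enough dots have formed) can be illustrated with a toy simulation. This is a minimal sketch, not the paper's model: the per-site nucleation rate law, the parameter values (`tau0`, `v0`, `n_crit`, site count), and the read-pulse width are all illustrative assumptions.

```python
import math
import random

def ots_switch_time(voltage, n_sites=100, dt=1e-9, tau0=1e-3,
                    v0=0.15, n_crit=10, max_steps=100_000):
    """Toy Monte Carlo of field-driven stochastic nucleation in an OTS layer.

    Each of n_sites candidate sites nucleates a conductive dot per time step
    dt with a field-accelerated probability; the selector is considered
    switched once n_crit dots have nucleated. Returns the switching time in
    seconds, or None if the device never switched in the simulated window.
    All parameters are hypothetical, chosen only for illustration.
    """
    # Per-site nucleation probability per step, exponential in the field
    # (a common assumed form for field-driven nucleation, not the paper's law).
    p = min(1.0, (dt / tau0) * math.exp(voltage / v0))
    nucleated = 0
    for step in range(max_steps):
        for _ in range(n_sites - nucleated):
            if random.random() < p:
                nucleated += 1
        if nucleated >= n_crit:
            return (step + 1) * dt
    return None

# Erratic-switching statistics: fraction of read attempts where the OTS
# fails to switch within a (hypothetical) 100 ns read pulse.
random.seed(0)
read_pulse = 100e-9
times = [ots_switch_time(1.2) for _ in range(200)]
errors = sum(t is None or t > read_pulse for t in times)
print(f"toy read error rate at 1.2 V: {errors / len(times):.1%}")
```

Sweeping `voltage` in this sketch reproduces the qualitative trade-off the paper quantifies: higher read voltage switches the selector faster and more reliably, at the cost of higher stress and consumption, so the reading error rate depends jointly on applied voltage and reading frequency.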