Memristor
Computer science
Crossbar switch
Artificial neural network
Deep learning
Memtransistor
Floating point
Voltage
Algorithm
Artificial intelligence
Computer hardware
Electronic engineering
Resistive random-access memory
Electrical engineering
Engineering
Telecommunications
Authors
Jiadong Chen, Shiping Wen, Kaibo Shi, Yin Yang
Identifiers
DOI: 10.1016/j.neunet.2021.09.016
Abstract
In current hardware design work for deep learning, the memristor, a non-volatile memory device with in-memory computing capability, has become a research hotspot. The weights in a deep neural network are floating-point numbers. Writing a floating-point value into a memristor incurs a loss of accuracy, and the writing process takes more time. The binarized neural network (BNN) binarizes the weights and activation values, originally floating-point numbers, to +1 and -1. This greatly reduces both the storage space and the time consumed in programming the resistance value of the memristor. It also helps simplify the programming of memristors in deep neural network circuits and speeds up the inference process. This paper provides a complete solution for implementing a memristive BNN. Furthermore, we improve the design of the memristor crossbar by converting the input feature map and the kernel before performing the convolution operation, which keeps the sign of the input voltage at each port constant. Therefore, we do not need to determine the sign of each port's input voltage in advance, which simplifies feeding the feature-map elements to the crossbar ports in the form of voltages. At the same time, to ensure that the output of the current convolution layer can be used directly as the input of the next layer, we add a corresponding processing circuit that integrates the batch-normalization and binarization operations.
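The abstract mentions three numerical ideas that can be sketched in software: binarizing values to ±1, fusing batch-normalization with the following binarization into a single threshold comparison, and shifting ±1 inputs into {0, 1} so that a dot product (the crossbar's analog operation) sees inputs of one constant sign. The following is a minimal NumPy sketch under those assumptions; the function names, the per-element BN parameters, and the particular {0, 1} shift are illustrative and are not taken from the paper's actual circuit design.

```python
import numpy as np

def binarize(x):
    # Map real values to {+1, -1}; zero maps to +1.
    return np.where(x >= 0, 1.0, -1.0)

def bn_then_binarize(x, gamma, beta, mu, var, eps=1e-5):
    # Reference path: batch-normalize, then binarize.
    y = gamma * (x - mu) / np.sqrt(var + eps) + beta
    return binarize(y)

def fused_bn_binarize(x, gamma, beta, mu, var, eps=1e-5):
    # Fused path: sign(BN(x)) collapses to one threshold comparison.
    # For gamma > 0: BN(x) >= 0  <=>  x >= tau; for gamma < 0 the
    # inequality flips. No normalization arithmetic at inference time.
    tau = mu - beta * np.sqrt(var + eps) / gamma
    if gamma > 0:
        return np.where(x >= tau, 1.0, -1.0)
    return np.where(x <= tau, 1.0, -1.0)

def shifted_dot(w, x_pm1):
    # One common way to keep input "voltages" non-negative: map
    # x in {+1,-1} to b = (x+1)/2 in {0,1}. Since x = 2b - 1,
    #   w . x = 2 (w . b) - sum(w),
    # so the crossbar only ever sees inputs of one sign.
    b = (x_pm1 + 1.0) / 2.0
    return 2.0 * np.dot(w, b) - np.sum(w)
```

The fused form matters for the hardware path described in the abstract: instead of an analog multiply-and-shift for batch normalization, the layer output only needs a comparator against a precomputed threshold `tau`, whose output is already the next layer's ±1 input.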