Self-Growing Binary Activation Network: A Novel Deep Learning Model With Dynamic Architecture

Authors
Ze-Yang Zhang, Yidong Chen, Changle Zhou
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems [Institute of Electrical and Electronics Engineers]
Volume/Issue: 35 (1): 624-633 · Citations: 4
Identifier
DOI: 10.1109/TNNLS.2022.3176027
Abstract

For a deep learning model, the network architecture is crucial: a model with an inappropriate architecture often suffers from performance degradation or parameter redundancy. However, finding the appropriate architecture for a given application is largely empirical and difficult. To tackle this problem, we propose a novel deep learning model with a dynamic architecture, named the self-growing binary activation network (SGBAN), which progressively extends the design of a fully connected network (FCN), resulting in a more compact architecture with higher performance on a given task. This constructive process is more efficient than neural architecture search methods, which train a large number of networks in search of the optimal one. Concretely, the training technique of SGBAN is based on function-preserving transformations that can expand the architecture and incorporate the information in new data without discarding the knowledge learned in previous steps. Experimental results on four classification tasks, i.e., Iris, MNIST, CIFAR-10, and CIFAR-100, demonstrate the effectiveness of SGBAN. On the one hand, SGBAN achieves accuracy competitive with an FCN of the same architecture, which indicates that the new training technique has optimization ability equivalent to traditional optimization methods. On the other hand, on MNIST, the architecture generated by SGBAN improves accuracy by 0.59% with only 33.44% of the parameters of FCNs with manually designed architectures, i.e., 500+150 hidden units. Furthermore, we demonstrate that replacing the fully connected layers of a well-trained VGG-19 with SGBAN yields slightly improved performance with less than 1% of the parameters on all these tasks.
Finally, we show that the proposed method can handle incremental learning tasks and outperforms three prominent incremental learning methods, i.e., learning without forgetting, elastic weight consolidation, and gradient episodic memory, on the incremental learning tasks Disjoint MNIST and Disjoint CIFAR-10.
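The abstract names function-preserving transformations as the mechanism that lets SGBAN grow without forgetting, but does not spell them out. As a point of reference, a minimal Net2Net-style widening sketch illustrates the general idea: a hidden unit is duplicated and its outgoing weights are halved, so the wider network computes exactly the same function before further training. This is an assumed illustration of the technique's family, not the paper's actual algorithm; all names (`widen`, `idx`) are hypothetical.

```python
import numpy as np

def widen(W1, b1, W2, idx):
    """Duplicate hidden unit `idx` and split its outgoing weights,
    preserving the network's input-output function."""
    W1_new = np.vstack([W1, W1[idx:idx + 1]])      # copy the unit's incoming weights
    b1_new = np.append(b1, b1[idx])                 # copy its bias
    W2_new = np.hstack([W2, W2[:, idx:idx + 1]])    # copy its outgoing weights
    W2_new[:, idx] *= 0.5                           # halve the original column...
    W2_new[:, -1] *= 0.5                            # ...and the duplicate, so they sum unchanged
    return W1_new, b1_new, W2_new

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))
b1 = rng.standard_normal(4)
W2 = rng.standard_normal((2, 4))
x = rng.standard_normal(3)

relu = lambda z: np.maximum(z, 0.0)
y_before = W2 @ relu(W1 @ x + b1)

W1w, b1w, W2w = widen(W1, b1, W2, idx=2)
y_after = W2w @ relu(W1w @ x + b1w)

assert np.allclose(y_before, y_after)  # the widened net computes the same function
```

The same duplicate-and-split argument holds for any elementwise activation applied per unit, including a binary (sign-like) activation, since duplicated units produce identical activations.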