Computer science
Residual
Feedforward
Benchmark
Artificial intelligence
Network architecture
Pattern recognition
Failure
Convolutional neural network
Process (computing)
Machine learning
Algorithm
Engineering
Control engineering
Operating system
Parallel computing
Computer security
Geodesy
Geography
Authors
Fei Wang,Mengqing Jiang,Chen Qian,Shuo Yang,Cheng Li,Honggang Zhang,Wei Wang,Xiaoou Tang
Source
Journal: Cornell University - arXiv
Date: 2017-07-01
Citations: 3206
Identifier
DOI:10.1109/cvpr.2017.683
Abstract
In this work, we propose the Residual Attention Network, a convolutional neural network with an attention mechanism that can be incorporated into state-of-the-art feed-forward network architectures in an end-to-end training fashion. Our Residual Attention Network is built by stacking Attention Modules, which generate attention-aware features. The attention-aware features from different modules change adaptively as layers go deeper. Inside each Attention Module, a bottom-up top-down feedforward structure is used to unfold the feedforward and feedback attention process into a single feedforward process. Importantly, we propose attention residual learning to train very deep Residual Attention Networks, which can be easily scaled up to hundreds of layers. Extensive analyses are conducted on the CIFAR-10 and CIFAR-100 datasets to verify the effectiveness of every module mentioned above. Our Residual Attention Network achieves state-of-the-art object recognition performance on three benchmark datasets: CIFAR-10 (3.90% error), CIFAR-100 (20.45% error), and ImageNet (4.8% single-model, single-crop top-5 error). Notably, our method achieves a 0.6% top-1 accuracy improvement with 46% of the trunk depth and 69% of the forward FLOPs compared to ResNet-200. Experiments also demonstrate that our network is robust against noisy labels.
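The "attention residual learning" idea from the abstract can be sketched as follows: rather than gating trunk features with a plain elementwise product M(x) * F(x), the module outputs (1 + M(x)) * F(x), so a near-zero mask leaves the trunk features (and their gradients) intact. This is a minimal NumPy illustration; the function and variable names are illustrative, not taken from the paper's code.

```python
import numpy as np

def attention_residual(trunk_features, soft_mask):
    """Attention residual learning (sketch).

    trunk_features: output F(x) of the trunk branch.
    soft_mask: soft attention mask M(x), typically in [0, 1].
    Returns (1 + M(x)) * F(x): when the mask is all zeros, the module
    reduces to an identity mapping of the trunk features, which is what
    allows stacking many Attention Modules without degrading features.
    """
    return (1.0 + soft_mask) * trunk_features

# Toy example with a 2x2 feature map.
F = np.array([[0.5, -1.0],
              [2.0,  0.0]])   # trunk branch output F(x)
M = np.array([[0.0,  0.5],
              [1.0,  0.0]])   # soft mask values in [0, 1]
out = attention_residual(F, M)
```

With an all-zero mask the output equals `F` exactly, illustrating why very deep stacks of such modules remain trainable, unlike naive masking where repeated products in [0, 1] shrink feature values layer after layer.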