Keywords
Feature (linguistics), Computer science, Block (permutation group theory), Convolutional neural network, Pattern recognition (psychology), Coding (set theory), Artificial intelligence, Algorithm, Mathematics, Philosophy, Linguistics, Geometry, Set (abstract data type), Programming language
Authors
Sanghyun Woo, Jongchan Park, Joon-Young Lee, In So Kweon
Source
Venue: arXiv (Cornell University)
Date: 2018
Citations: 740
Identifier
DOI: 10.48550/arXiv.1807.06521
Abstract
We propose Convolutional Block Attention Module (CBAM), a simple yet effective attention module for feed-forward convolutional neural networks. Given an intermediate feature map, our module sequentially infers attention maps along two separate dimensions, channel and spatial, then the attention maps are multiplied to the input feature map for adaptive feature refinement. Because CBAM is a lightweight and general module, it can be integrated into any CNN architectures seamlessly with negligible overheads and is end-to-end trainable along with base CNNs. We validate our CBAM through extensive experiments on ImageNet-1K, MS COCO detection, and VOC 2007 detection datasets. Our experiments show consistent improvements in classification and detection performances with various models, demonstrating the wide applicability of CBAM. The code and models will be publicly available.
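The mechanism the abstract describes (a channel attention map followed by a spatial attention map, each multiplied onto the feature map) can be sketched compactly. Below is a minimal PyTorch-style sketch based on the paper's description, assuming the standard choices reported there (a shared MLP with reduction ratio 16 for channel attention, a 7x7 convolution over channel-pooled maps for spatial attention); the class and parameter names (ChannelAttention, SpatialAttention, CBAM, reduction) are illustrative and not the authors' released code.

# Minimal CBAM sketch (illustrative, not the reference implementation).
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Shared MLP applied to both the average- and max-pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # (B, C) from global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # (B, C) from global max pooling
        return torch.sigmoid(avg + mx).view(b, c, 1, 1)

class SpatialAttention(nn.Module):
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Pool along the channel axis, then convolve the 2-channel map.
        avg = x.mean(dim=1, keepdim=True)     # (B, 1, H, W)
        mx, _ = x.max(dim=1, keepdim=True)    # (B, 1, H, W)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class CBAM(nn.Module):
    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        self.channel = ChannelAttention(channels, reduction)
        self.spatial = SpatialAttention(kernel_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel(x)   # refine along the channel dimension first
        x = x * self.spatial(x)   # then refine along the spatial dimension
        return x

# Usage: the module preserves the input shape, so it can be dropped into
# any CNN block with negligible overhead.
feat = torch.randn(2, 64, 32, 32)
print(CBAM(64)(feat).shape)       # torch.Size([2, 64, 32, 32])

The sequential ordering (channel first, then spatial) follows the abstract; the two sub-modules are independent, which is what makes the block easy to insert into existing architectures.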