Authors
Jie Pan, Haigen Hu, Aizhu Liu, Qianwei Zhou, Guan Qin
Identifier
DOI: 10.1109/icpr56361.2022.9956146
Abstract
Attention is one of the most valuable breakthroughs in the deep learning community, and how to effectively exploit channel and spatial attention information remains an active research topic. In this work, we combine the advantages of the channel and spatial mechanisms to propose a Channel-Spatial hybrid Attention Module (CSHAM). Specifically, we first propose a max-average fusion Channel Attention Module and a Spatial Attention Neighbor Enhancement Module. We then analyze and design the connection between the two modules, proposing an alternate connection strategy based on the transfer of channel weights. The key idea is to reuse the channel weight information generated by the channel attention module, thereby reducing the additional network complexity introduced by the attention mechanism. Finally, a series of comparison experiments are conducted on CIFAR100 and Caltech-101 with various backbone models. The results show that the proposed method achieves the best Top-1 performance among existing popular methods, improving accuracy by nearly 1% while essentially preserving the parameter count and FLOPs. The code is publicly available at https://github.com/HuHaigen/A-Channel-Spatial-Hybrid-Attention-Mechanism-using-Channel-Weight-Transfer-Strategy. The package includes the proposed CSHAM for reproducibility purposes.
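The abstract's pipeline (max-average fused channel attention, neighbor-enhanced spatial attention, and reuse of the channel weights) can be sketched as follows. This is a minimal NumPy illustration under assumptions: the exact fusion, the 3x3 neighbor enhancement, and the way channel weights are transferred are guesses at the structure, not the authors' implementation (which is in the linked repository), and all function names here are hypothetical.

```python
# Hedged sketch of a channel-spatial hybrid attention module with
# channel-weight reuse. The fusion and enhancement rules are assumptions,
# not the paper's CSHAM implementation.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x):
    """Max-average fusion channel attention (assumed form).
    x: feature map of shape (C, H, W); returns per-channel weights (C,)."""
    avg_desc = x.mean(axis=(1, 2))   # global average pooling
    max_desc = x.max(axis=(1, 2))    # global max pooling
    # The paper learns the fusion; here the two descriptors are simply summed.
    return sigmoid(avg_desc + max_desc)

def spatial_attention(x):
    """Spatial attention with a simple 3x3 neighbor enhancement (assumed form).
    x: (C, H, W); returns a spatial weight map (H, W)."""
    fused = x.mean(axis=0) + x.max(axis=0)   # channel-wise avg + max maps
    # Neighbor enhancement: average each pixel over its 3x3 neighborhood.
    padded = np.pad(fused, 1, mode="edge")
    h, w = fused.shape
    enhanced = sum(
        padded[i:i + h, j:j + w] for i in range(3) for j in range(3)
    ) / 9.0
    return sigmoid(enhanced)

def csham(x):
    """Alternate connection: channel weights scale the input, then are
    reused (transferred) after the spatial attention step."""
    cw = channel_attention(x)                 # (C,)
    x_c = x * cw[:, None, None]               # channel-refined features
    sw = spatial_attention(x_c)               # (H, W)
    # Channel-weight transfer: reuse cw instead of recomputing attention.
    return x_c * sw[None, :, :] * cw[:, None, None]

x = np.random.rand(8, 16, 16).astype(np.float32)
y = csham(x)
print(y.shape)  # (8, 16, 16)
```

Because the channel weights are computed once and reused, the module adds no second channel-attention pass, which matches the abstract's motivation of keeping the parameter and FLOP overhead low.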