A Dense-Sparse Complementary Network for Human Action Recognition based on RGB and Skeleton Modalities

Keywords: Computer science, RGB color model, Artificial intelligence, Computer vision, Convolutional neural network, Leverage (statistics), Deep learning, Pattern recognition (psychology)
Authors
Cheng Qin,Jun Cheng,Zhen Liu,Ziliang Ren,Jianming Liu
Source
Journal: Expert Systems With Applications [Elsevier BV]
Volume 244, Article 123061. Cited by: 13
Identifier
DOI:10.1016/j.eswa.2023.123061
Abstract

The vulnerability of RGB-based human action recognition in complex environments and varying scenes can be compensated by the skeleton modality. Therefore, action recognition methods that fuse RGB and skeleton modalities have received increasing attention. However, the recognition performance of existing methods is still unsatisfactory due to insufficiently optimized sampling, modeling, and fusion strategies, and their computational cost is often heavy. In this paper, we propose a Dense-Sparse Complementary Network (DSCNet), which aims to leverage the complementary information of the RGB and skeleton modalities at a light computational cost to obtain competitive action recognition performance. Specifically, we first adopt dense and sparse sampling strategies according to the advantages of the RGB and skeleton modalities, respectively. We then use the skeleton as guiding information to crop the key active region of the persons in the RGB frames, which largely eliminates background interference. Moreover, a Short-Term Motion Extraction Module (STMEM) is proposed to compress the densely sampled RGB frames into fewer frames before feeding them into the backbone network, which avoids a surge in computational cost, and a Sparse Multi-Scale Spatial-Temporal convolutional neural Network (Sparse-MSSTNet) is designed to model the sparse skeleton. Extensive experiments show that our method effectively combines the complementary information of the RGB and skeleton modalities to improve recognition accuracy. DSCNet achieves competitive performance on the NTU RGB+D 60, NTU RGB+D 120, PKU-MMD, UAV-Human, IKEA ASM and Northwestern-UCLA datasets with much less computational cost than existing methods. The code is available at https://github.com/Maxchengqin/DSCNet.
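The abstract describes two preprocessing ideas that can be illustrated concretely: dense temporal sampling for the RGB stream versus sparse sampling for the skeleton stream, and skeleton-guided cropping of the active person region in an RGB frame. The sketch below is not the authors' implementation (that is in the linked repository); the function names, the margin value, and the joint format (J x 2 pixel coordinates) are assumptions made purely for illustration.

```python
# Minimal sketch of dense/sparse sampling and skeleton-guided cropping.
# All names, the margin, and the joint layout are illustrative assumptions,
# not the released DSCNet code.
import numpy as np


def sample_indices(num_frames: int, num_samples: int) -> np.ndarray:
    """Evenly spaced temporal sampling: a large num_samples gives the dense
    RGB stream, a small num_samples gives the sparse skeleton stream."""
    return np.linspace(0, num_frames - 1, num_samples).round().astype(int)


def skeleton_guided_crop(frame: np.ndarray, joints_2d: np.ndarray,
                         margin: float = 0.15) -> np.ndarray:
    """Crop the person region using 2D joints (J x 2, pixel coords) as
    guidance, with a relative margin around the joint bounding box to keep
    some context and tolerate pose-estimation noise."""
    h, w = frame.shape[:2]
    x_min, y_min = joints_2d.min(axis=0)
    x_max, y_max = joints_2d.max(axis=0)
    pad_x = margin * (x_max - x_min)
    pad_y = margin * (y_max - y_min)
    x0 = max(int(x_min - pad_x), 0)
    y0 = max(int(y_min - pad_y), 0)
    x1 = min(int(x_max + pad_x) + 1, w)
    y1 = min(int(y_max + pad_y) + 1, h)
    return frame[y0:y1, x0:x1]


# Toy usage: a 64-frame clip, sampled densely for the RGB branch and
# sparsely for the skeleton branch, plus a crop of a dummy frame.
rgb_idx = sample_indices(64, 32)                            # dense indices
skel_idx = sample_indices(64, 8)                            # sparse indices
frame = np.zeros((480, 640, 3), dtype=np.uint8)             # dummy RGB frame
joints = np.array([[300, 100], [280, 260], [330, 400]])     # dummy 2D joints
crop = skeleton_guided_crop(frame, joints)
print(rgb_idx.shape, skel_idx.shape, crop.shape)
```

In the paper's pipeline, the densely sampled RGB frames would additionally be compressed by the STMEM before entering the backbone; that module's internals are not specified in the abstract, so it is not sketched here.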