Gesture
Computer science
Gesture recognition
Transfer learning
Artificial intelligence
Speech recognition
Subject (documents)
Mode (computer interface)
Computer vision
Human-computer interaction
Library science
Authors
Yue Lian,Zongxing Lu,Xin Huang,Qican Shangguan,Ligang Yao,Jie Huang,Zhoujie Liu
Source
Journal: IEEE Sensors Journal
[Institute of Electrical and Electronics Engineers]
Date: 2024-04-01
Volume/Issue: 24 (10): 17183-17192
Identifier
DOI:10.1109/jsen.2024.3382040
Abstract
The hand gesture recognition (HGR) technology in the A-mode ultrasound human-machine interface (HMI-A), when based on traditional machine learning, relies on intricate feature-reduction methods. Researchers need prior knowledge and multiple rounds of validation to find the optimal combination of features and machine-learning algorithms. Furthermore, anatomical differences in the forearm muscles among subjects prevent subject-specific models from generalizing to unknown subjects, necessitating repeated retraining. This increases users' time costs and limits the real-world application of HMI-A. Hence, this paper presents a lightweight one-dimensional, four-branch squeeze-and-excitation convolutional neural network (4-branch SENet) that outperforms traditional machine-learning methods in both feature extraction and gesture classification. Building on this, a weight fine-tuning strategy based on transfer learning enables rapid gesture recognition across subjects and over time. Comparative analysis indicates that freezing the feature layers and fine-tuning the fully connected layers yields an average accuracy of 96.35% ± 3.04% and an average runtime of 4.8 s ± 0.15 s, 52.9% faster than training subject-specific models. This method further extends the application scenarios of HMI-A in fields such as medical rehabilitation and intelligent prosthetics.
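The abstract describes a four-branch 1-D SE-CNN whose convolutional feature layers are frozen while only the fully connected layers are fine-tuned for a new subject. The following is a minimal PyTorch sketch of that freeze-and-fine-tune setup; the branch kernel sizes, channel widths, input shape (8 ultrasound channels, 1000 samples), and class count are illustrative assumptions, not the authors' published configuration.

```python
# Sketch of a lightweight 1-D four-branch SE-CNN with transfer-learning
# fine-tuning (freeze feature branches, retrain the FC head). All layer
# sizes below are assumptions for illustration only.
import torch
import torch.nn as nn


class SEBlock1D(nn.Module):
    """Squeeze-and-excitation recalibration for 1-D feature maps."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = x.mean(dim=-1)              # squeeze: global average over length
        w = self.fc(w).unsqueeze(-1)    # excite: per-channel weights in (0, 1)
        return x * w


class FourBranchSENet(nn.Module):
    """Four parallel 1-D conv branches with SE blocks, then an FC classifier."""

    def __init__(self, in_channels: int = 8, n_classes: int = 8):
        super().__init__()
        # Different kernel sizes per branch (assumed) to capture echo
        # patterns at different temporal scales.
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv1d(in_channels, 16, kernel_size=k, padding=k // 2),
                nn.BatchNorm1d(16),
                nn.ReLU(inplace=True),
                SEBlock1D(16),
                nn.AdaptiveAvgPool1d(1),
            )
            for k in (3, 5, 7, 9)
        )
        self.classifier = nn.Sequential(
            nn.Linear(4 * 16, 64),
            nn.ReLU(inplace=True),
            nn.Linear(64, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [b(x).flatten(1) for b in self.branches]
        return self.classifier(torch.cat(feats, dim=1))


def fine_tune_for_new_subject(model: FourBranchSENet, lr: float = 1e-3):
    """Freeze the convolutional feature branches; fine-tune only the FC head."""
    for p in model.branches.parameters():
        p.requires_grad = False
    return torch.optim.Adam(model.classifier.parameters(), lr=lr)


if __name__ == "__main__":
    # Pretend source-subject weights are already loaded; adapt to a new subject.
    model = FourBranchSENet()
    optimizer = fine_tune_for_new_subject(model)
    x = torch.randn(2, 8, 1000)          # (batch, ultrasound channels, samples)
    loss = nn.CrossEntropyLoss()(model(x), torch.tensor([0, 3]))
    loss.backward()
    optimizer.step()
```

Because only the small fully connected head is updated, adaptation to a new subject needs far fewer gradient steps than retraining the whole network, which is consistent with the reported runtime reduction.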