Journal: IEEE Internet of Things Journal [Institute of Electrical and Electronics Engineers]. Date: 2024-04-26. Volume/Issue: 11 (15): 26314-26328. Citations: 10
Identifier
DOI:10.1109/jiot.2024.3394244
Abstract
Intelligent wearable systems have been widely used in health monitoring, motion tracking, and engineering safety. However, the single-function design of current wearable systems cannot satisfy the requirements of complex scenarios, and such systems lack a link to virtual 3D visualization platforms. To address these issues, this paper proposes a novel intelligent wearable system with motion and emotion recognition. Multiple sensors are integrated into the system to collect motion and emotion information. To classify and recognize multisensor information accurately, we propose a novel human action recognition network, the three-branch spatial-temporal feature extraction network (TB-SFENet), which obtains more robust features and achieves an accuracy of 97.04% on the UCI-HAR dataset and 92.68% on the UniMiB SHAR dataset. To establish the relationship between the real entity and the virtual space, we use digital twin (DT) technology to build a 3D display DT platform. The platform enables real-time information interaction, covering activity, emotion, location, and monitoring information. Additionally, we establish the TGAM electroencephalogram emotion classification (TEEC) dataset, which contains 120,000 pieces of data, for the proposed system. Experimental results indicate that the proposed system realizes virtual-reality information interaction between a personal digital human and the actual person based on the intelligent wearable system, which has great potential for applications in intelligent healthcare, virtual reality, and other fields.
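The abstract does not describe TB-SFENet's internals, so the following is only an illustrative sketch of the general idea behind a three-branch spatial-temporal feature extractor for windowed inertial sensor data: three parallel branches with different temporal receptive fields, each pooling over time, whose outputs are concatenated. All kernel sizes, the random weights, and the 128×9 window shape (typical of UCI-HAR-style segmentation) are assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernel):
    """Valid 1-D convolution along the time axis, applied per channel."""
    T, C = x.shape
    k = len(kernel)
    out = np.empty((T - k + 1, C))
    for c in range(C):
        out[:, c] = np.convolve(x[:, c], kernel, mode="valid")
    return out

def branch(x, kernel_size):
    """One hypothetical branch: temporal conv + ReLU + global average pool."""
    kernel = rng.standard_normal(kernel_size) / kernel_size  # assumed random weights
    h = np.maximum(conv1d(x, kernel), 0.0)  # ReLU
    return h.mean(axis=0)                   # pool over time -> (C,)

def three_branch_features(x):
    """Concatenate features from three branches with different receptive fields
    (kernel sizes 3/5/7 are purely illustrative)."""
    return np.concatenate([branch(x, k) for k in (3, 5, 7)])

# Toy input: a window of 128 time steps from 9 inertial sensor channels.
window = rng.standard_normal((128, 9))
feats = three_branch_features(window)
print(feats.shape)  # (27,)
```

In a real implementation each branch would carry learned weights and the concatenated feature vector would feed a classifier; here the sketch only shows how multi-scale branches yield a fixed-length fused representation regardless of window length.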