Keywords
Computer Science
Virtual Reality
Human-Computer Interaction
Usability
Face (sociological concept)
Multimedia
Mixed Reality
Social Science
Sociology
Authors
Zhihan Lyu, Mikael Fridenfalk
Source
Journal: IEEE Transactions on Visualization and Computer Graphics
Publisher: Institute of Electrical and Electronics Engineers
Date: 2024-01-01
Pages: 1-10
Identifiers
DOI: 10.1109/tvcg.2024.3372055
Abstract
This work aims to pioneer the development of a real-time, interactive, and immersive Metaverse Human-Computer Interaction (HCI) system leveraging Virtual Reality (VR). The system incorporates a three-dimensional (3D) face reconstruction method, grounded in weakly supervised learning, to enhance player-player interactions within the Metaverse. In the proposed method, two-dimensional (2D) face images are effectively employed in a 2D-Assisted Self-supervised Learning (2DASL) approach, significantly improving 3D model learning outcomes and the quality of 3D face alignment in HCI systems. The work outlines the functional modules of the system, encompassing user interactions such as hugs and handshakes, as well as voice and text communication via blockchain. Solutions for managing multiple simultaneous online users are presented. Performance evaluation of the HCI system in a 3D reconstruction scene indicates that the 2DASL face reconstruction method achieves noteworthy results, enhancing the system's interaction capabilities by aiding 3D face modeling through 2D face images. The experimental system achieves a maximum processing speed of 18 frames of image data per second on a personal computer, meeting real-time processing requirements. User feedback on social acceptance, action-interaction usability, emotion, and satisfaction with the VR interactive system reveals consistently high scores. The designed VR HCI system exhibits outstanding performance across diverse applications.
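The abstract's core technical idea is that 2D face images, carrying only 2D annotations, can weakly supervise the learning of a 3D face model (the 2DASL approach). The record provides no code, so the following is a minimal PyTorch-style sketch of that general idea only, assuming a 3DMM-style parameter regressor supervised by 2D landmark reprojection; every class and function name below (Face3DParamRegressor, project_landmarks, weakly_supervised_step) is hypothetical and not taken from the paper.

```python
# Minimal sketch (not from the paper): weakly supervised 3D face
# reconstruction in the spirit of 2DASL, where in-the-wild 2D face
# images with detected 2D landmarks supervise a 3D-parameter regressor.
# All module and function names are hypothetical.
import torch
import torch.nn as nn

class Face3DParamRegressor(nn.Module):
    """Small CNN mapping a 2D face crop to a flat parameter vector
    (a real system would regress 3DMM identity/expression/pose)."""
    def __init__(self, n_params: int = 136):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, n_params),
        )

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        return self.backbone(img)

def project_landmarks(params: torch.Tensor) -> torch.Tensor:
    """Stand-in for the differentiable decode-and-project step: a real
    system would turn 3DMM coefficients into 3D landmarks and project
    them to 2D with a weak-perspective camera."""
    batch = params.shape[0]
    return params.reshape(batch, 68, 2)

def weakly_supervised_step(model, img, landmarks_2d, optimizer):
    """One training step whose only supervision is 2D landmarks on
    2D face images (no 3D ground truth), i.e. weak supervision."""
    optimizer.zero_grad()
    pred_2d = project_landmarks(model(img))
    loss = nn.functional.smooth_l1_loss(pred_2d, landmarks_2d)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = Face3DParamRegressor()
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    img = torch.randn(4, 3, 120, 120)     # dummy face crops
    lmk = torch.rand(4, 68, 2) * 120      # dummy 2D landmark targets
    print(weakly_supervised_step(model, img, lmk, opt))
```

In a full system, the stand-in project_landmarks would decode the regressed coefficients through the face model's shape and expression bases and a camera projection, so that the 2D landmark loss back-propagates into the 3D parameters.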