Computer Science
Gesture
Certification (law)
Gesture Recognition
Speech Recognition
Human-Computer Interaction
Computer Vision
Artificial Intelligence
Computer Networks
Computer Security
Authors
Yong Wang, Tianyu Yang, Chunxiao Wang, Feng Li, Pengfei Hu, Yiran Shen
Source
Journal: IEEE Internet of Things Journal
[Institute of Electrical and Electronics Engineers]
Date: 2024-03-25
Volume/Issue: 11 (12): 22007-22020
Citations: 2
Identifier
DOI: 10.1109/jiot.2024.3380811
Abstract
Wireless headphones, particularly wireless earbuds, have surged in popularity as smart wearables in recent years. Empowered by artificial intelligence (AI), these devices are broadening their utility in areas such as speech recognition, augmented reality, pose recognition, and healthcare monitoring, enriching user experiences through novel interactive interfaces driven by embedded sensors. However, the widespread adoption of wireless earbuds has raised concerns regarding security and privacy, necessitating robust bespoke security measures. Although the miniaturization of mobile chips enables the integration of sophisticated algorithms into smart wearables, the research and industrial communities have yet to give adequate attention to earbud security. This paper focuses on empowering wireless earbuds to authenticate their legitimate users, tackling the challenges associated with conventional authentication methods. Instead of relying on input-interface authentication methods such as PINs or lock patterns, this research leverages Inertial Measurement Unit (IMU) data collected during interactions with devices to extract novel biometric features, presenting an alternative approach that nonetheless confronts challenges related to signal capture and interference. Consequently, we propose and design BudsAuth, an implicit user authentication framework that harnesses built-in IMU sensors in smart earbuds to capture vibration signals induced by on-face touching interactions with the earbuds. These vibrations are utilized to deliver continuous and implicit user authentication with high precision and compatibility across various earbud models. Extensive evaluation demonstrates BudsAuth's capability to achieve an Equal Error Rate (EER) of 0.0003, representing approximately 99.97% accuracy with seven consecutive samples of interactive gestures for implicit authentication.
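The abstract reports an EER of 0.0003 reached by fusing seven consecutive gesture samples. As an illustration only, the sketch below shows a generic score-fusion decision of that shape and the arithmetic linking EER to the quoted accuracy; the function name, threshold, and averaging rule are assumptions for illustration, not BudsAuth's actual method.

```python
# Hypothetical sketch: fuse per-gesture similarity scores over a window
# of seven consecutive samples, as the abstract describes. The averaging
# rule and threshold value are assumptions, not the paper's algorithm.
def authenticate(scores, threshold=0.5, n_required=7):
    """Accept the user if the mean similarity score over the most
    recent n_required gesture samples exceeds the threshold."""
    if len(scores) < n_required:
        return False  # not enough consecutive samples observed yet
    window = scores[-n_required:]
    return sum(window) / n_required > threshold

# At the equal-error-rate operating point, the false-accept and
# false-reject rates coincide, so accuracy is roughly 1 - EER:
eer = 0.0003
print(f"approx. accuracy: {1 - eer:.2%}")  # → 99.97%
```

Averaging scores before thresholding is one common fusion choice; voting over per-sample decisions would be an equally plausible alternative under the same seven-sample constraint.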