Computer science
Robustness (evolution)
Artificial intelligence
Robotics
Computer vision
Motion (physics)
Human motion
Biological motion
Perception
Activity recognition
Real-time computing
Human–computer interaction
Biochemistry
Chemistry
Neuroscience
Biology
Gene
Authors
Guozhen Zhu,Chenshu Wu,Xiaolu Zeng,Beibei Wang,K. J. Ray Liu
Identifier
DOI:10.1109/mass56207.2022.00073
Abstract
Recently, extensive research has focused on indoor intelligent perception applications and systems. However, the performance of these applications can be greatly degraded by the movement of non-human subjects, such as pets, robots, and electrical appliances, making them impractical for mass use. In this paper, we present the first system that passively and unobtrusively distinguishes between moving human and non-human subjects using a single pair of commodity WiFi transceivers, without requiring the subjects to wear any device or move within a restricted area. Our system detects moving subjects, extracts physically and statistically explainable features of their motion, and distinguishes non-human from human movements accordingly. Leveraging the state-of-the-art rich-scattering multipath model, our system can differentiate human and non-human motion through walls, even in complex environments. Built on environment-independent features, it can be applied to new environments without further effort from users. We validate its performance with commodity WiFi in four different buildings on subjects including a pet, a vacuum robot, a human, and a fan. The results show that our system achieves 97.7% recognition accuracy and a 95.7% true positive rate for non-human motion recognition. Furthermore, it achieves 95.2% accuracy in unseen environments without model tuning, demonstrating its accuracy and robustness for ubiquitous use.
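The abstract describes a two-stage idea: first detect whether anything is moving from WiFi channel measurements, then classify the motion using explainable features. As a hedged illustration only (not the authors' implementation, whose features and thresholds are not given here), the sketch below uses a lag-one autocorrelation of CSI amplitude, a motion statistic commonly used in WiFi sensing: motion correlates consecutive channel samples, while a static channel yields near-zero autocorrelation. The function names and the threshold are assumptions for illustration.

```python
import numpy as np

def motion_statistic(csi_amp, lag=1):
    """Lag-`lag` autocorrelation of a CSI amplitude time series.

    Values near 1 indicate correlated channel variation (motion);
    values near 0 indicate a static channel with only noise.
    """
    x = np.asarray(csi_amp, dtype=float)
    x = x - x.mean()
    denom = float(np.dot(x, x))
    if denom == 0.0:
        return 0.0
    return float(np.dot(x[:-lag], x[lag:]) / denom)

def detect_motion(csi_amp, threshold=0.1):
    """Toy motion detector; the threshold is illustrative, not from the paper."""
    return motion_statistic(csi_amp) > threshold
```

In the full pipeline sketched by the abstract, a detector like this would gate a second stage that computes environment-independent features (e.g., periodicity or speed-related statistics) to separate human gait from pets, robots, and fans; that stage depends on the paper's specific feature design and is not reproduced here.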