Computer science
Computer vision
Eye movement
Human eye
Artificial intelligence
Calibration
Gaze
Position (finance)
Wearable computer
Detector
Tracking (education)
Mathematics
Psychology
Telecommunications
Pedagogy
Statistics
Finance
Economics
Embedded system
Authors
Saori Yoshida, Masaharu Yoshikawa, Suguru Sangu
Abstract
For daily use of AR technology, the development of smart glasses that look like ordinary eyeglasses has accelerated, and eye-tracking devices that support their video presentation and natural human-machine interfaces have been attracting attention. Over the past few years, we have been developing a non-video-based eye-tracking system that consists of a VCSEL array and a position-sensitive detector (PSD) and can be implemented in small glasses-type devices without compromising device design or appearance. Wearable eye-tracking devices frequently require calibration when the device becomes misaligned or the user changes. The most common calibration method is to have the user gaze at multiple fixed points, but this interrupts the user's activities and causes stress. To eliminate this calibration stress, we propose a novel algorithm that estimates the shape and position of the user's eyes from continuously detected data and corrects the gaze direction while the glasses are being worn. The fundamental principle of the algorithm is that these eye parameters affect the spatial characteristics of the laser beam spot reflected from the eye surface and detected on the PSD. Bayesian estimation is used to update the probability distribution obtained from unconscious eye movements, and the eye parameters are identified with the help of canonical correlation analysis. In this paper, the gaze detection algorithm with its autonomous calibration mechanism is described in detail, and a ray-tracing simulation is performed as a proof of concept. The results demonstrate the applicability of the proposed algorithm to an eye-tracking module that imposes no calibration stress on the user.
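The abstract names two generic statistical ingredients of the autonomous calibration: a Bayesian update of a probability distribution driven by continuously detected PSD data, and canonical correlation analysis (CCA) relating spot characteristics to eye parameters. The Python sketch below only illustrates these two standard techniques; it is not the authors' implementation. The forward model, the choice of corneal radius as the estimated parameter, the noise level, and the simulated data are all invented for the example.

```python
# Illustrative sketch only: grid-based Bayesian update of one hypothetical eye
# parameter (corneal radius) from noisy PSD spot positions, plus CCA between
# simulated spot features and eye parameters. Forward model and data are made up.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)

def forward_model(radius_mm):
    """Toy mapping from corneal radius (mm) to mean PSD spot position (mm)."""
    return 0.4 * radius_mm - 2.0  # assumed linear relation, for illustration only

grid = np.linspace(6.5, 9.5, 301)                # candidate corneal radii (mm)
posterior = np.full(grid.size, 1.0 / grid.size)  # flat prior
true_radius, noise_sigma = 7.8, 0.05             # simulated truth / PSD noise (mm)

for _ in range(200):  # samples gathered during unconscious eye movements
    spot = forward_model(true_radius) + rng.normal(0.0, noise_sigma)
    likelihood = np.exp(-0.5 * ((spot - forward_model(grid)) / noise_sigma) ** 2)
    posterior *= likelihood      # Bayes rule (unnormalized)
    posterior /= posterior.sum() # renormalize

print("MAP corneal radius estimate (mm):", grid[np.argmax(posterior)])

# CCA between simulated PSD spot features (X) and eye parameters (Y);
# the linear mixing matrix below is fabricated for the example.
n = 500
Y = rng.normal(size=(n, 2))  # e.g. corneal radius, eyeball-center offset
X = Y @ np.array([[0.9, 0.2], [0.1, 0.8]]) + 0.1 * rng.normal(size=(n, 2))

cca = CCA(n_components=2)
X_c, Y_c = cca.fit_transform(X, Y)
corrs = [np.corrcoef(X_c[:, i], Y_c[:, i])[0, 1] for i in range(2)]
print("Canonical correlations:", np.round(corrs, 3))
```

In the actual system the inputs would be the measured or ray-traced spot positions on the PSD rather than the toy forward model above, and the estimated quantities would be the full set of eye shape and position parameters described in the paper.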