
EG-Net: Appearance-based eye gaze estimation using an efficient gaze network with attention mechanism

Keywords: Computer science; Gaze; Artificial intelligence; Convolutional neural network; Computer vision; Face; Eye movement; Feature; Set (abstract data type); Pose; Task; Pattern recognition; Social science; Linguistics; Philosophy; Management; Sociology; Economics; Programming language
Authors
Xinmei Wu, Lin Li, Haihong Zhu, Gang Zhou, Linfeng Li, Fei Su, Shen He, Yang‐Gang Wang, Xue Long
Source
Journal: Expert Systems With Applications [Elsevier]
Volume: 238, Article 122363. Cited by: 1
Identifier
DOI: 10.1016/j.eswa.2023.122363
Abstract

Gaze estimation, which has a wide range of applications in many scenarios, is a challenging task due to various unconstrained conditions. As information from both full-face and eye images is instrumental in improving gaze estimation, many multiregion gaze estimation models have been proposed in recent studies. However, most of them simply apply the same regression method to both eye and face images, overlooking that the eye region may contribute more fine-grained features than the full-face region, and that differences between an individual's left and right eyes caused by head pose, illumination, or partial occlusion may lead to inconsistent estimations. To address these issues, we propose an appearance-based end-to-end learning network architecture with an attention mechanism, named efficient gaze network (EG-Net), which employs a two-branch network for gaze estimation. Specifically, a base CNN is used for full-face images, while an efficient eye network (EE-Net), scaled up from the base CNN, is used for left- and right-eye images. EE-Net uniformly scales up the depth, width and resolution of the base CNN with a set of constant coefficients for eye feature extraction and adaptively weights the left- and right-eye images via an attention network according to their "image quality". Finally, features from the full-face image, the two individual eye images and head pose vectors are fused to regress the eye gaze vectors. We evaluate our approach on three public datasets, where the proposed EG-Net model achieves much better performance. In particular, our EG-Net-v4 model outperforms state-of-the-art approaches on the MPIIFaceGaze dataset, with prediction errors of 2.41 cm and 2.76 degrees in 2D and 3D gaze estimation, respectively. It also improves performance to 1.58 cm on the GazeCapture dataset and 4.55 degrees on the EyeDIAP dataset, a 23.4% and 14.2% improvement over prior art on the two datasets, respectively. The code related to this project is open-source and available at https://github.com/wuxinmei/EE_Net.git.
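To make the fusion pipeline concrete, below is a minimal PyTorch-style sketch of the two-branch idea the abstract describes: a face backbone, a shared eye backbone applied to the left- and right-eye crops, an attention score that weights the two eye features, and a regressor over the fused face, eye, and head-pose features. All module names, layer sizes, and the attention formulation here are illustrative assumptions rather than the authors' EG-Net/EE-Net implementation; the official code is at https://github.com/wuxinmei/EE_Net.git.

```python
# A minimal sketch of the two-branch fusion idea described in the abstract.
# Backbones, dimensions, and the attention form are placeholders, not the
# released EG-Net/EE-Net architecture.
import torch
import torch.nn as nn


class SmallCNN(nn.Module):
    """Placeholder backbone standing in for the base CNN / scaled EE-Net."""

    def __init__(self, out_dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, out_dim)

    def forward(self, x):
        return self.fc(self.features(x).flatten(1))


class TwoBranchGazeNet(nn.Module):
    """Face branch plus a shared eye branch; eye features are fused with
    attention weights, then concatenated with face and head-pose features
    to regress a 3D gaze vector."""

    def __init__(self, feat_dim=128, pose_dim=3):
        super().__init__()
        self.face_net = SmallCNN(feat_dim)   # base CNN on the full face
        self.eye_net = SmallCNN(feat_dim)    # shared weights for both eyes
        # Scores each eye feature so a degraded eye contributes less.
        self.attn = nn.Linear(feat_dim, 1)
        self.regressor = nn.Sequential(
            nn.Linear(feat_dim * 2 + pose_dim, 128), nn.ReLU(),
            nn.Linear(128, 3),               # 3D gaze direction
        )

    def forward(self, face, left_eye, right_eye, head_pose):
        f_face = self.face_net(face)
        f_l, f_r = self.eye_net(left_eye), self.eye_net(right_eye)
        # Softmax over the two eye scores gives per-sample weights in [0, 1].
        w = torch.softmax(torch.cat([self.attn(f_l), self.attn(f_r)], dim=1), dim=1)
        f_eye = w[:, :1] * f_l + w[:, 1:] * f_r
        return self.regressor(torch.cat([f_face, f_eye, head_pose], dim=1))


if __name__ == "__main__":
    model = TwoBranchGazeNet()
    gaze = model(torch.randn(2, 3, 224, 224),   # face crops
                 torch.randn(2, 3, 64, 96),     # left-eye crops
                 torch.randn(2, 3, 64, 96),     # right-eye crops
                 torch.randn(2, 3))             # head-pose vectors
    print(gaze.shape)  # torch.Size([2, 3])
```

The softmax over the two per-eye scores is one simple way to realize "adaptively weights the left- and right-eye images"; the paper's attention network and backbone scaling (EfficientNet-style compound scaling of depth, width and resolution) may differ in detail.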