Materials Science
Nanotechnology
Pattern Recognition (Psychology)
Human-Computer Interaction
Artificial Intelligence
Computer Science
Authors
Yijing Xu,Shifan Yu,Lei Liu,Wansheng Lin,Zhicheng Cao,Yong‐Min Liang,Jiming Duan,Zhigang Huang,Chao Wei,Ziquan Guo,Tingzhu Wu,Zhong Chen,Qingliang Liao,Yuanjin Zheng,Xinqin Liao
Identifier
DOI:10.1002/adfm.202411331
Abstract
Tactile intent recognition systems, which are highly desired to satisfy human needs and provide humanized services, must accurately understand and identify human intent. They generally rely on time-driven sensor arrays to achieve high spatiotemporal resolution, but such arrays encounter inevitable challenges of low scalability, huge data volumes, and complex processing. Here, an event-driven intent recognition touch sensor (IR touch sensor) with in-sensor computing capability is presented. The merits of event-driven operation and in-sensor computing enable the IR touch sensor to achieve ultrahigh resolution and to capture complete intent information with intrinsically concise data. It achieves critical signal extraction of action trajectories with a rapid response time of 0.4 ms and excellent durability of >10 000 cycles, marking an important breakthrough in tactile intent recognition. Versatile applications demonstrate the integrated functions of the IR touch sensor and its great interactive potential in all-weather environments regardless of shading, dynamics, darkness, and noise. Unconscious and even hidden action features can be extracted with an ultrahigh recognition accuracy of 98.4% for intent recognition. A further auxiliary diagnostic test demonstrates the practicability of the IR touch sensor in telemedicine palpation and therapy. This integration of sensing, data reduction, and ultrahigh-accuracy recognition is expected to propel the leapfrog development of conscious machine intelligence.
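To illustrate the data-reduction argument behind event-driven readout versus conventional time-driven sensor arrays, the minimal sketch below contrasts the two sampling strategies on a synthetic touch signal. It is not the paper's method or hardware: the array size, frame rate, change threshold, and the simulated signal are all hypothetical values chosen only to show why reporting changes as events yields far fewer data points than sampling every taxel at a fixed rate.

```python
# Illustrative sketch only (assumptions: array size, frame count, and threshold
# are hypothetical and not taken from the paper).
import numpy as np

rng = np.random.default_rng(0)

N_TAXELS = 16 * 16   # hypothetical sensing-element count
N_FRAMES = 1000      # hypothetical number of sampling frames
THRESHOLD = 0.05     # hypothetical change threshold for emitting an event

# Synthetic touch signal: most taxels stay idle, a few drift slowly.
frames = np.zeros((N_FRAMES, N_TAXELS))
active = rng.choice(N_TAXELS, size=8, replace=False)
frames[:, active] = np.cumsum(
    rng.normal(0, 0.02, size=(N_FRAMES, len(active))), axis=0
)

# Time-driven readout: every taxel is sampled in every frame.
time_driven_samples = N_FRAMES * N_TAXELS

# Event-driven readout: a (frame, taxel, value) event is emitted only when a
# taxel's value has moved past the threshold since its last reported value.
last_reported = frames[0].copy()
events = []
for t in range(1, N_FRAMES):
    changed = np.abs(frames[t] - last_reported) > THRESHOLD
    for idx in np.flatnonzero(changed):
        events.append((t, idx, frames[t, idx]))
        last_reported[idx] = frames[t, idx]

print(f"time-driven samples : {time_driven_samples}")
print(f"event-driven events : {len(events)}")
print(f"data reduction      : {1 - len(events) / time_driven_samples:.1%}")
```

Running the sketch shows the event stream carrying only the changing taxels, which is the intuition behind the "intrinsic concise data" claimed for the event-driven IR touch sensor.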