Keywords
Sensor fusion
Computer science
Artificial intelligence
Computer vision
Fusion
Pattern recognition
Authors
Yudi Chen, Zhi Xiong, Jianye Liu
Source
Journal: IEEE Sensors Journal [Institute of Electrical and Electronics Engineers]
Date: 2024-04-24
Volume/Issue: 24 (11): 18122-18132
Identifier
DOI: 10.1109/JSEN.2024.3390773
Abstract
The mammalian brain manages navigational behavior by processing sensory information. A brain-inspired multisensor navigation information fusion model is developed based on discovered neural mechanisms. The architecture of this model is inspired by the information transmission method of a part of the brain, which integrates navigation information from multiple sensors to provide the position of unmanned systems. First, the brain-inspired multisensor information fusion architecture is established based on the anatomical structure of the hippocampal formation. Then, continuous attractor neural networks are utilized to model head-direction cells, three-dimensional (3D) grid cells, and 3D place cells. These spatial representation cell models integrate external perceptual information and self-motion cues to generate firing rates, which realizes navigation information fusion and accurate spatial cognition for unmanned systems. Finally, methods for decoding the firing rates of these spatial representation cells are proposed to obtain navigation parameters. The proposed model is verified on simulated data, the KITTI dataset, and an unmanned ground vehicle. The experiments demonstrate that the proposed brain-inspired model can fuse multisensor information, leading to more accurate positioning than traditional navigation models.
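To make the abstract's mechanism concrete, the sketch below illustrates one piece of it: a ring of head-direction cells whose activity bump is shifted by a self-motion cue (path integration) and then corrected by a noisy external heading observation, with the heading decoded from the population firing rates. This is a minimal toy model, not the paper's implementation; the bump width, fusion weight, and noise level are illustrative assumptions.

```python
import numpy as np

N = 100                                              # number of head-direction cells
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)  # preferred directions

def bump(center, width=0.3):
    """Gaussian activity bump on the ring (wrapped angular distance)."""
    d = np.angle(np.exp(1j * (theta - center)))
    return np.exp(-d**2 / (2 * width**2))

def decode(r):
    """Population-vector decoding of heading from firing rates."""
    return np.angle(np.sum(r * np.exp(1j * theta)))

rng = np.random.default_rng(0)
r = bump(0.0)                                        # initial attractor state
true_heading = 0.0
for _ in range(200):
    omega = 0.05                                     # self-motion cue (rad/step)
    true_heading = (true_heading + omega) % (2 * np.pi)
    r = bump(decode(r) + omega)                      # path integration: bump re-centres
    obs = true_heading + rng.normal(0, 0.1)          # noisy external heading cue
    r = 0.9 * r + 0.1 * bump(obs)                    # fuse cue into attractor activity
    r /= r.max()                                     # normalisation (global inhibition)

err = abs(np.angle(np.exp(1j * (decode(r) - true_heading))))
print(err)                                           # residual heading error (rad)
```

The external-cue fusion keeps the integrated heading from drifting, which is the role the abstract assigns to external perceptual information; the paper extends the same attractor-and-decode idea to 3D grid and place cells for position.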