Authors
Jianjun Yao, Yingzhao Li
Source
Journal: Industrial Robot: An International Journal
[Emerald (MCB UP)]
Date: 2024-03-11
Identifier
DOI: 10.1108/ir-11-2023-0309
Abstract
Purpose: Handcrafted keypoints show weak repeatability, which leads to tracking failures in visual simultaneous localization and mapping (SLAM) systems under challenging conditions such as illumination change, rapid rotation and large viewpoint variation. Learning-based keypoints are more repeatable but incur considerable computational cost. This paper proposes a keypoint extraction algorithm that strikes a balance between precision and efficiency, aiming for accurate, robust and versatile visual localization in highly complex scenes.
Design/methodology/approach: SiLK-SLAM first refines the state-of-the-art learning-based extractor SiLK and introduces a postprocessing algorithm that homogenizes keypoints and improves operational efficiency. It further devises a reliable relocalization strategy, PCPnP, which leverages progressive and consistent sampling to bolster robustness.
Findings: Evaluations on the TUM, KITTI and EuRoC data sets show that SiLK-SLAM achieves higher localization accuracy than ORB-SLAM3 and other methods, improving on ORB-SLAM3 by up to 70.99%, 87.20% and 85.27% on the three data sets, respectively. Relocalization experiments confirm that SiLK-SLAM produces precise and repeatable keypoints and remains robust in challenging environments.
Originality/value: SiLK-SLAM delivers high localization accuracy and resilience in demanding scenarios, which is key to enhancing the autonomy of robots navigating complex environments. Code is available at https://github.com/Pepper-FlavoredChewingGum/SiLK-SLAM.
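The abstract does not detail the keypoint homogenization step. Purely as an illustration, the sketch below shows one common way to spatially homogenize detector output (grid-based top-score selection); the function and parameter names (homogenize_keypoints, grid, per_cell) are hypothetical and are not taken from SiLK-SLAM.

```python
import numpy as np

def homogenize_keypoints(xy, scores, image_shape, grid=8, per_cell=2):
    """Grid-based keypoint homogenization (illustrative sketch, not the paper's method).

    xy:          (N, 2) keypoint pixel coordinates (x, y)
    scores:      (N,) detector confidence per keypoint
    image_shape: (height, width) of the source image
    grid:        number of cells per image side
    per_cell:    keep at most this many top-scoring keypoints per cell
    """
    h, w = image_shape
    cell_h, cell_w = h / grid, w / grid
    # Assign each keypoint to a grid cell.
    cell_ids = (xy[:, 1] // cell_h).astype(int) * grid + (xy[:, 0] // cell_w).astype(int)
    keep = []
    for cid in np.unique(cell_ids):
        idx = np.where(cell_ids == cid)[0]
        # Retain only the highest-scoring detections in this cell.
        best = idx[np.argsort(scores[idx])[::-1][:per_cell]]
        keep.extend(best.tolist())
    return np.array(sorted(keep))

# Example: 500 random detections on a 480x640 image.
rng = np.random.default_rng(0)
pts = rng.uniform([0, 0], [640, 480], size=(500, 2))
conf = rng.uniform(size=500)
kept = homogenize_keypoints(pts, conf, (480, 640))
print(f"kept {kept.size} of {pts.shape[0]} keypoints")
```

Capping the number of keypoints per cell spreads detections across the image, which is the general intent behind homogenization: well-distributed keypoints tend to constrain pose estimation better than clusters of strong responses in a few textured regions.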
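The paper's PCPnP relocalization relies on progressive and consistent sampling, which the abstract does not specify further. As a point of reference only, the sketch below recovers a camera pose from 2D-3D matches with OpenCV's standard RANSAC-based PnP solver (cv2.solvePnPRansac); it is not the authors' PCPnP algorithm, and the helper relocalize is hypothetical.

```python
import cv2
import numpy as np

def relocalize(map_points_3d, keypoints_2d, K, dist=None,
               reproj_err=3.0, iterations=100, confidence=0.99):
    """Estimate camera pose from 2D-3D matches with RANSAC PnP (generic sketch).

    map_points_3d: (N, 3) map landmarks matched to the current frame
    keypoints_2d:  (N, 2) corresponding pixel observations
    K:             (3, 3) camera intrinsic matrix
    Returns (R, t, inlier_indices) or None if the solve fails.
    """
    dist = np.zeros(5) if dist is None else dist
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        map_points_3d.astype(np.float64),
        keypoints_2d.astype(np.float64),
        K, dist,
        iterationsCount=iterations,
        reprojectionError=reproj_err,
        confidence=confidence,
        flags=cv2.SOLVEPNP_EPNP,
    )
    if not ok or inliers is None:
        return None
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec.reshape(3), inliers.ravel()
```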