Keywords: Path (computing), Algorithm, Fast path, Heuristic, Node (physics), Motion planning, Computer science, Process (computing), Mathematical optimization, Diagonal, Path length, Mathematics, Engineering, Artificial intelligence, Robotics, Operating systems, Computer networks, Geometry, Structural engineering, Programming languages
Authors
Na Liu, Chiyue Ma, Zihang Hu, Pengfei Guo, Yun Ge, Min Tian
Source
Journal: Mathematical Biosciences and Engineering [American Institute of Mathematical Sciences]
Date: 2024-01-01
Volume/Issue: 21 (2): 2137-2162
Cited by: 2
Abstract
<abstract> <p>This article proposes an improved A* algorithm that improves the quality of logistics paths for automated guided vehicles (AGVs) in digital production workshops, addressing the problems of excessive path turns and long transportation times. The traditional A* algorithm is improved both internally and externally. In the internal improvement, we propose an improved node search method within the A* algorithm that avoids generating invalid paths; replace the traditional heuristic function with one based on diagonal distance to reduce the number of turns in the path; and add turning weights to the A* cost formula, further reducing both the number of turns and the number of node searches. In the external improvement, the path output by the internally improved A* algorithm is further optimized by an improved forward search optimization algorithm and the Bézier curve method, which reduce path length and turns and yield a path with fewer turns and a shorter distance. The experimental results demonstrate that the internally improved A* algorithm proposed in this work outperforms six conventional path planning methods. Relative to the internally improved A* path, the fully improved A* algorithm reduces the turning angle by approximately 69% and shortens the path by approximately 10%. The simulation results show that the improved A* algorithm can reduce AGV running time and improve logistics efficiency in the workshop; specifically, the AGV travel time along the improved A* path is 12 s shorter than along the traditional A* path.</p> </abstract>
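As a rough illustration of the internal improvements the abstract describes (not the authors' exact formulation), the sketch below implements grid A* with a diagonal-distance (octile) heuristic and a turning penalty folded into the edge cost. The `turn_weight` value, the 8-connected move set, and the 0/1 occupancy-grid representation are all assumptions made for this sketch:

```python
import heapq
import itertools
import math

def diagonal_distance(a, b):
    """Octile distance: exact cost-to-go on an empty 8-connected unit grid."""
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    return dx + dy + (math.sqrt(2) - 2) * min(dx, dy)

def a_star(grid, start, goal, turn_weight=0.5):
    """A* over a 0/1 occupancy grid; states carry a heading so turns can be penalized."""
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1),
             (-1, -1), (-1, 1), (1, -1), (1, 1)]
    tick = itertools.count()        # tie-breaker so the heap never compares headings
    best = {(start, None): 0.0}     # cheapest known g per (cell, heading) state
    came = {}                       # back-pointers for path reconstruction
    heap = [(diagonal_distance(start, goal), next(tick), start, None)]
    while heap:
        _, _, node, heading = heapq.heappop(heap)
        if node == goal:
            path, key = [node], (node, heading)
            while key in came:      # walk back-pointers, dropping headings
                key = came[key]
                path.append(key[0])
            return path[::-1]
        for dx, dy in moves:
            nxt = (node[0] + dx, node[1] + dy)
            if not (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])):
                continue            # off the grid
            if grid[nxt[0]][nxt[1]]:
                continue            # blocked cell
            step = math.hypot(dx, dy)   # 1 for straight moves, sqrt(2) for diagonals
            turn = turn_weight if heading not in (None, (dx, dy)) else 0.0
            g = best[(node, heading)] + step + turn
            key = (nxt, (dx, dy))
            if g < best.get(key, math.inf):
                best[key] = g
                came[key] = (node, heading)
                heapq.heappush(heap, (g + diagonal_distance(nxt, goal),
                                      next(tick), nxt, (dx, dy)))
    return None                     # goal unreachable
```

On an empty 3×3 grid this returns the straight diagonal [(0, 0), (1, 1), (2, 2)]: the turn penalty rewards keeping a constant heading, which is the effect the turning-weight term in the paper is aiming for.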