Keywords
Computer science
Generalizability theory
Closeness
Artificial intelligence
Transformer
Machine learning
Trajectory
Astronomy
Mathematics
Quantum mechanics
Statistics
Physics
Mathematical analysis
Voltage
Authors
Li-Wu Tsao, Yankai Wang, Hao-Siang Lin, Hong-Han Shuai, Lai-Kuan Wong, Wen-Huang Cheng
Identifier
DOI:10.1007/978-3-031-20047-2_14
Abstract
Earlier trajectory prediction approaches focus on capturing sequential structure among pedestrians with recurrent networks, which are known to have limitations in modeling long sequences. To address this limitation, some recent works proposed Transformer-based architectures built on attention mechanisms. However, these Transformer-based networks are trained end-to-end without capitalizing on the value of pre-training. In this work, we propose Social-SSL, which captures cross-sequence trajectory structure via self-supervised pre-training and plays a crucial role in improving both the data efficiency and the generalizability of Transformer networks for trajectory prediction. Specifically, Social-SSL models interaction and motion patterns with three pretext tasks: interaction type prediction, closeness prediction, and masked cross-sequence to sequence pre-training. Comprehensive experiments show that Social-SSL outperforms the state-of-the-art methods by at least 12% and 20% on the ETH/UCY and SDD datasets in terms of Average Displacement Error and Final Displacement Error (code available at https://github.com/Sigta678/Social-SSL).
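To make the three pretext tasks concrete, the following is a minimal, hypothetical PyTorch sketch of how three task heads could sit on a shared Transformer encoder over concatenated ego/neighbor trajectories. All module names, dimensions, class counts, and the masking and pooling choices here are illustrative assumptions, not the authors' implementation; see the linked repository for the actual code.

```python
import torch
import torch.nn as nn

class SocialSSLPretextSketch(nn.Module):
    """Hypothetical sketch of Social-SSL-style pretext heads (not the paper's code)."""

    def __init__(self, d_model=64, nhead=4, num_layers=2,
                 num_interaction_types=4, num_closeness_bins=3):
        super().__init__()
        # Embed 2-D (x, y) trajectory points into the model dimension.
        self.point_embed = nn.Linear(2, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers)
        # Pretext task 1: classify the interaction type between two agents
        # (the number of types is an assumption here).
        self.interaction_head = nn.Linear(d_model, num_interaction_types)
        # Pretext task 2: predict a binned closeness between the two agents.
        self.closeness_head = nn.Linear(d_model, num_closeness_bins)
        # Pretext task 3: reconstruct masked points of the neighbor sequence.
        self.reconstruction_head = nn.Linear(d_model, 2)

    def forward(self, ego_traj, neighbor_traj, mask):
        # ego_traj, neighbor_traj: (batch, seq_len, 2); mask: (batch, seq_len),
        # True at neighbor time steps whose coordinates are masked out.
        neighbor_in = neighbor_traj.masked_fill(mask.unsqueeze(-1), 0.0)
        # Concatenate the two sequences so self-attention can model
        # cross-sequence structure, in the spirit of the abstract's
        # "masked cross-sequence to sequence" pre-training.
        tokens = torch.cat([ego_traj, neighbor_in], dim=1)
        h = self.encoder(self.point_embed(tokens))
        pooled = h.mean(dim=1)                 # crude pooling over all tokens
        neighbor_h = h[:, ego_traj.size(1):]   # hidden states of neighbor steps
        return (self.interaction_head(pooled),
                self.closeness_head(pooled),
                self.reconstruction_head(neighbor_h))

if __name__ == "__main__":
    model = SocialSSLPretextSketch()
    ego = torch.randn(8, 12, 2)          # 8 pairs, 12 observed time steps
    nbr = torch.randn(8, 12, 2)
    mask = torch.rand(8, 12) < 0.15      # mask ~15% of neighbor steps
    inter_logits, close_logits, recon = model(ego, nbr, mask)
    print(inter_logits.shape, close_logits.shape, recon.shape)
```

In such a setup, the two classification heads would be trained with cross-entropy on pseudo-labels derived from the trajectories themselves, and the reconstruction head with a regression loss on the masked coordinates, so no manual annotation is needed during pre-training.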