Keywords
Computer science, Point cloud, Transformer, Cloud computing, Artificial intelligence, Data mining, Benchmark, Feature learning, Pattern recognition (psychology), Sparse matrix, Feature (linguistics), Machine learning, Physics, Philosophy, Operating system, Quantum mechanics, Gaussian distribution, Voltage, Linguistics, Geography, Geodesy
Authors
Yong Wang,Yangyang Liu,Pengbo Zhou,Guohua Geng,Qi Zhang
Identifier
DOI:10.1016/j.cag.2023.07.040
Abstract
Compared to the traditional self-attention structure of Transformers, the MLP-like structure offers advantages such as simplicity and improved performance. However, effectively and efficiently learning features from sparse, irregular, and unordered 3D point cloud data remains a challenge. To address this issue, we propose SparseFormer, a sparse transformer network designed specifically for point cloud processing tasks. SparseFormer incorporates a sparse MLP module that enables accurate feature learning while considering the unique characteristics of 3D point cloud data. Additionally, we enhance the context information by utilizing a multi-scale feature aggregation module. Experimental results demonstrate the superior performance of SparseFormer on classification tasks using benchmark datasets, including the ModelNet40 synthetic dataset and the ScanObjectNN real-world dataset. In the classification experiment on the ScanObjectNN dataset, SparseFormer achieves a mean accuracy of 84.1% and an overall accuracy of 85.5%.
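The abstract names two ingredients without giving their details: an MLP-like module (replacing self-attention) that learns per-point features, and a multi-scale module that aggregates context. The sketch below is a minimal, hypothetical illustration of those two ideas, not the authors' actual SparseFormer architecture: the class names (`PointMLPBlock`, `MultiScaleAggregation`), the k-nearest-neighbor pooling, and all hyperparameters are assumptions introduced here for illustration only.

```python
# A minimal sketch of the two ideas named in the abstract, NOT the paper's
# actual architecture: a permutation-equivariant MLP-like block applied per
# point (no self-attention) and a multi-scale aggregation that pools features
# over neighborhoods of several sizes. All names and sizes are illustrative.
import torch
import torch.nn as nn


class PointMLPBlock(nn.Module):
    """Per-point MLP with a residual connection (a hypothetical stand-in
    for the paper's sparse MLP module)."""
    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, dim * 2),
            nn.GELU(),
            nn.Linear(dim * 2, dim),
        )
        self.norm = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_points, dim); applying the same MLP to every point
        # keeps the block permutation-equivariant, as unordered clouds require.
        return x + self.net(self.norm(x))


def knn_group_pool(xyz: torch.Tensor, feats: torch.Tensor, k: int) -> torch.Tensor:
    """Max-pool each point's features over its k nearest spatial neighbors."""
    # xyz: (B, N, 3), feats: (B, N, C)
    dist = torch.cdist(xyz, xyz)                       # (B, N, N) pairwise distances
    idx = dist.topk(k, dim=-1, largest=False).indices  # (B, N, k) neighbor indices
    B, N, C = feats.shape
    gathered = torch.gather(
        feats.unsqueeze(1).expand(B, N, N, C), 2,
        idx.unsqueeze(-1).expand(B, N, k, C),
    )                                                  # (B, N, k, C)
    return gathered.max(dim=2).values                  # (B, N, C)


class MultiScaleAggregation(nn.Module):
    """Concatenate neighborhood-pooled features at several scales and fuse
    them linearly (one assumed reading of 'multi-scale feature aggregation')."""
    def __init__(self, dim: int, scales=(8, 16, 32)):
        super().__init__()
        self.scales = scales
        self.fuse = nn.Linear(dim * len(scales), dim)

    def forward(self, xyz: torch.Tensor, feats: torch.Tensor) -> torch.Tensor:
        pooled = [knn_group_pool(xyz, feats, k) for k in self.scales]
        return self.fuse(torch.cat(pooled, dim=-1))


if __name__ == "__main__":
    xyz = torch.randn(2, 1024, 3)    # a batch of 2 clouds, 1024 points each
    feats = torch.randn(2, 1024, 64)
    block = PointMLPBlock(64)
    agg = MultiScaleAggregation(64)
    out = agg(xyz, block(feats))
    print(out.shape)                 # torch.Size([2, 1024, 64])
```

A classification head in this style would typically max-pool `out` over the point dimension and feed the result to a small MLP over the class logits; the brute-force `torch.cdist` neighbor search here is only for clarity and would be replaced by a spatially indexed k-NN for large clouds.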