Kernel (algebra)
Computer science
Point cloud
Point (geometry)
Representation (politics)
Convolution (computer science)
Algorithm
Segmentation
Euclidean space
Artificial intelligence
Mathematics
Discrete mathematics
Geometry
Mathematical analysis
Artificial neural network
Politics
Political science
Law
Authors
Hugues Thomas, Charles R. Qi, Jean-Emmanuel Deschaud, Beatriz Marcotegui, François Goulette, Leonidas Guibas
Source
Journal: Le Centre pour la Communication Scientifique Directe - HAL - Diderot
Date: 2019-10-01
Pages: 6410-6419
Citations: 2003
Identifiers
DOI:10.1109/iccv.2019.00651
Abstract
We present Kernel Point Convolution (KPConv), a new design of point convolution, i.e., one that operates on point clouds without any intermediate representation. The convolution weights of KPConv are located in Euclidean space by kernel points, and applied to the input points close to them. Its capacity to use any number of kernel points gives KPConv more flexibility than fixed grid convolutions. Furthermore, these locations are continuous in space and can be learned by the network. Therefore, KPConv can be extended to deformable convolutions that learn to adapt kernel points to local geometry. Thanks to a regular subsampling strategy, KPConv is also efficient and robust to varying densities. Whether they use deformable KPConv for complex tasks, or rigid KPConv for simpler tasks, our networks outperform state-of-the-art classification and segmentation approaches on several datasets. We also offer ablation studies and visualizations to provide understanding of what has been learned by KPConv and to validate the descriptive power of deformable KPConv.
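The operation described in the abstract can be illustrated with a short sketch. The Python/NumPy code below is a minimal, hypothetical implementation of a kernel-point style convolution at a single query point, assuming a linear correlation between neighbors and kernel points that decays to zero at an influence radius sigma; all names (kpconv_single_point, sigma, etc.) are illustrative and this is not the authors' reference implementation.

```python
import numpy as np

def kpconv_single_point(center, neighbor_xyz, neighbor_feats,
                        kernel_points, weights, sigma):
    """Illustrative kernel-point convolution at one query point (sketch, not the official code).

    center        : (3,)  position of the query point
    neighbor_xyz  : (N, 3) positions of input points in the query's neighborhood
    neighbor_feats: (N, D_in) features of those input points
    kernel_points : (K, 3) kernel point positions, relative to the query point
    weights       : (K, D_in, D_out) one learnable weight matrix per kernel point
    sigma         : influence radius of each kernel point
    """
    # Relative positions of neighbors with respect to the query point
    rel = neighbor_xyz - center                                                  # (N, 3)

    # Linear correlation between each neighbor and each kernel point:
    # h = max(0, 1 - ||rel - kernel_point|| / sigma)
    dists = np.linalg.norm(rel[:, None, :] - kernel_points[None, :, :], axis=-1)  # (N, K)
    h = np.maximum(0.0, 1.0 - dists / sigma)                                     # (N, K)

    # Each kernel point aggregates the features of nearby neighbors,
    # then applies its own weight matrix; the results are summed.
    per_kernel = h.T @ neighbor_feats                                            # (K, D_in)
    out = np.einsum('kd,kde->e', per_kernel, weights)                            # (D_out,)
    return out


# Toy usage with random data
rng = np.random.default_rng(0)
center = np.zeros(3)
neighbors = rng.normal(scale=0.05, size=(32, 3))
feats = rng.normal(size=(32, 5))
kpts = rng.normal(scale=0.03, size=(15, 3))
W = rng.normal(size=(15, 5, 8))
print(kpconv_single_point(center, neighbors, feats, kpts, W, sigma=0.03).shape)  # (8,)
```

In the rigid variant the kernel point positions stay fixed, while the deformable variant described in the abstract would additionally predict offsets for kernel_points from the input features so the kernel adapts to local geometry.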