Paper Title


RobustPointSet: A Dataset for Benchmarking Robustness of Point Cloud Classifiers

Authors

Saeid Asgari Taghanaki, Jieliang Luo, Ran Zhang, Ye Wang, Pradeep Kumar Jayaraman, Krishna Murthy Jatavallabhula

Abstract


The 3D deep learning community has seen significant strides in point cloud processing over the last few years. However, the datasets on which deep models are trained have largely remained the same. Most datasets comprise clean, clutter-free point clouds canonicalized for pose. Models trained on these datasets fail in uninterpretable and unintuitive ways when presented with data that contains transformations "unseen" at train time. While data augmentation enables models to be robust to "previously seen" input transformations, 1) we show that this does not work for unseen transformations during inference, and 2) data augmentation makes it difficult to analyze a model's inherent robustness to transformations. To this end, we create a publicly available dataset, called RobustPointSet, for robustness analysis of point cloud classification models (independent of data augmentation) to input transformations. Our experiments indicate that despite all the progress in point cloud classification, no single architecture consistently performs better -- several fail drastically -- when evaluated on transformed test sets. We also find that robustness to unseen transformations cannot be brought about merely by extensive data augmentation. RobustPointSet can be accessed at https://github.com/AutodeskAILab/RobustPointSet.
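The evaluation protocol the abstract describes -- train on clean point clouds, then measure accuracy on test sets perturbed by transformations the model never saw -- can be sketched as follows. This is a minimal illustration, not the paper's code: the transformation names (`rotate_z`, `add_gaussian_noise`) and the `evaluate_robustness` helper are assumptions for the example, and RobustPointSet's actual transformation suite may differ.

```python
import numpy as np

def rotate_z(points, angle_rad):
    """Rotate an (N, 3) point cloud about the z-axis (illustrative transform)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return points @ R.T

def add_gaussian_noise(points, sigma=0.02):
    """Jitter every coordinate with zero-mean Gaussian noise (illustrative transform)."""
    return points + np.random.normal(0.0, sigma, points.shape)

def evaluate_robustness(model_fn, test_clouds, labels, transforms):
    """Score a trained classifier on transformed copies of a clean test set.

    model_fn: callable mapping an (N, 3) cloud to a predicted class label.
    transforms: dict of {name: transform_fn}; each fn maps a cloud to a cloud.
    Returns per-transformation accuracy.
    """
    labels = np.asarray(labels)
    results = {}
    for name, transform in transforms.items():
        preds = np.array([model_fn(transform(cloud)) for cloud in test_clouds])
        results[name] = float(np.mean(preds == labels))
    return results
```

The key point this sketch makes concrete is that the transformations are applied only at test time; the model under evaluation is trained without them, so the resulting accuracy gap isolates the architecture's inherent robustness rather than the effect of augmentation.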
