Paper Title

Monitored Distillation for Positive Congruent Depth Completion

Paper Authors

Tian Yu Liu, Parth Agrawal, Allison Chen, Byung-Woo Hong, Alex Wong

Paper Abstract

We propose a method to infer a dense depth map from a single image, its calibration, and the associated sparse point cloud. In order to leverage existing models (teachers) that produce putative depth maps, we propose an adaptive knowledge distillation approach that yields a positive congruent training process, wherein a student model avoids learning the error modes of the teachers. In the absence of ground truth for model selection and training, our method, termed Monitored Distillation, allows a student to exploit a blind ensemble of teachers by selectively learning from predictions that best minimize the reconstruction error for a given image. Monitored Distillation yields a distilled depth map and a confidence map, or "monitor", for how well a prediction from a particular teacher fits the observed image. The monitor adaptively weights the distilled depth where if all of the teachers exhibit high residuals, the standard unsupervised image reconstruction loss takes over as the supervisory signal. On indoor scenes (VOID), we outperform blind ensembling baselines by 17.53% and unsupervised methods by 24.25%; we boast a 79% model size reduction while maintaining comparable performance to the best supervised method. For outdoors (KITTI), we tie for 5th overall on the benchmark despite not using ground truth. Code available at: https://github.com/alexklwong/mondi-python.
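The abstract describes the core mechanism: per-pixel selection from a blind ensemble of teachers based on image reconstruction error, a confidence map ("monitor") derived from that error, and a fallback to the unsupervised reconstruction loss where all teachers fit poorly. The sketch below is a minimal illustration of that idea, not the authors' implementation; the exponential monitor form, the `sigma` scale, and the L1 distillation term are assumptions made for illustration. See the official repository (https://github.com/alexklwong/mondi-python) for the actual code.

```python
# Minimal sketch of monitored distillation (illustrative only).
import numpy as np

def monitored_distillation_target(teacher_depths, residuals, sigma=1.0):
    """Build a distilled depth map and a confidence ("monitor") map.

    teacher_depths : (T, H, W) putative depth maps from a blind ensemble
    residuals      : (T, H, W) image reconstruction error of each teacher
    sigma          : assumed scale controlling how quickly confidence decays
    """
    # Per-pixel index of the teacher whose prediction best explains the image.
    best = np.argmin(residuals, axis=0)                  # (H, W)
    rows, cols = np.indices(best.shape)
    distilled_depth = teacher_depths[best, rows, cols]   # (H, W)
    best_residual = residuals[best, rows, cols]          # (H, W)

    # Monitor: high confidence where the chosen teacher fits the observed
    # image, low confidence where all teachers exhibit high residuals.
    monitor = np.exp(-best_residual / sigma)             # values in (0, 1]
    return distilled_depth, monitor

def total_loss(student_depth, distilled_depth, monitor, unsupervised_loss_map):
    # Monitor-weighted distillation term; where confidence is low, the
    # standard unsupervised image reconstruction loss takes over.
    distill = monitor * np.abs(student_depth - distilled_depth)
    unsup = (1.0 - monitor) * unsupervised_loss_map
    return float(np.mean(distill + unsup))
```

In this sketch, pixels where every teacher reconstructs the image poorly receive a small monitor value, so the student is supervised mainly by the unsupervised reconstruction term there; pixels where some teacher fits well are pulled toward that teacher's depth.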
