Paper Title

Noise-Sampling Cross Entropy Loss: Improving Disparity Regression Via Cost Volume Aware Regularizer

Authors

Yang Chen, Zongqing Lu, Xuechen Zhang, Lei Chen, Qingmin Liao

Abstract

Recent end-to-end deep neural networks for disparity regression have achieved state-of-the-art performance. However, many well-acknowledged properties specific to disparity estimation are omitted in these deep learning algorithms. In particular, the matching cost volume, one of the most important intermediate products, is treated as an ordinary intermediate feature for the subsequent softargmin regression and lacks the explicit constraints found in traditional algorithms. In this paper, inspired by the canonical definition of the cost volume, we propose the noise-sampling cross entropy loss function to regularize the cost volume produced by deep neural networks to be unimodal and coherent. Extensive experiments validate that the proposed noise-sampling cross entropy loss not only helps neural networks learn a more informative cost volume, but also leads to better stereo matching performance compared with several representative algorithms.
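The abstract describes the method only at a high level and does not spell out the exact loss. The following PyTorch-style sketch merely illustrates the general idea it refers to: soft-argmin disparity regression over a cost volume, plus a cross-entropy regularizer that pushes the matching distribution toward a unimodal shape centered at the ground-truth disparity. The discretized-Laplacian target, the loss weighting, and all function and parameter names are assumptions for illustration, not the authors' exact noise-sampling formulation.

```python
import torch
import torch.nn.functional as F


def soft_argmin(cost_volume):
    """Disparity regression by soft-argmin over a cost volume of shape (B, D, H, W).

    Lower cost means a better match, so the softmax is taken over negated costs.
    """
    prob = F.softmax(-cost_volume, dim=1)                            # (B, D, H, W)
    disp_values = torch.arange(
        cost_volume.size(1), device=cost_volume.device,
        dtype=cost_volume.dtype).view(1, -1, 1, 1)
    return (prob * disp_values).sum(dim=1)                           # (B, H, W)


def unimodal_cross_entropy_loss(cost_volume, gt_disparity, sigma=1.0):
    """Cross-entropy regularizer encouraging the matching distribution to be
    unimodal around the ground-truth disparity.

    The discretized-Laplacian target used here is an illustrative assumption,
    not the paper's exact noise-sampling construction.
    """
    d = cost_volume.size(1)
    disp_values = torch.arange(
        d, device=cost_volume.device, dtype=cost_volume.dtype).view(1, d, 1, 1)
    # Unimodal target distribution centered at the ground-truth disparity.
    target = torch.exp(-torch.abs(disp_values - gt_disparity.unsqueeze(1)) / sigma)
    target = target / target.sum(dim=1, keepdim=True)
    log_prob = F.log_softmax(-cost_volume, dim=1)
    return -(target * log_prob).sum(dim=1).mean()


# Hypothetical usage: combine the regularizer with a standard regression loss.
cost_volume = torch.randn(2, 64, 32, 32)      # (B, D, H, W), e.g. from a 3D CNN
gt_disparity = torch.rand(2, 32, 32) * 63     # ground-truth disparity map
pred_disp = soft_argmin(cost_volume)
loss = F.smooth_l1_loss(pred_disp, gt_disparity) \
       + 0.5 * unimodal_cross_entropy_loss(cost_volume, gt_disparity)
```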
