Paper Title
Learning Edge-Preserved Image Stitching from Large-Baseline Deep Homography
Paper Authors
Paper Abstract
Image stitching is a classical and crucial technique in computer vision, which aims to generate an image with a wide field of view. Traditional methods depend heavily on feature detection and require that scene features be dense and evenly distributed in the images, leading to varying ghosting effects and poor robustness. Learning-based methods usually suffer from fixed-view and fixed-input-size limitations, showing a lack of generalization ability on other real datasets. In this paper, we propose an image stitching learning framework, which consists of a large-baseline deep homography module and an edge-preserved deformation module. First, we propose a large-baseline deep homography module to estimate an accurate projective transformation between the reference image and the target image using features at different scales. After that, an edge-preserved deformation module is designed to learn the deformation rules of image stitching from edge to content, eliminating ghosting effects as much as possible. In particular, the proposed learning framework can stitch images of arbitrary views and input sizes, thus contributing to a supervised deep image stitching method with excellent generalization capability on other real images. Experimental results demonstrate that our homography module significantly outperforms existing deep homography methods in large-baseline scenes. In image stitching, our method is superior to the existing learning-based method and shows performance competitive with state-of-the-art traditional methods.
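To make the role of the estimated projective transformation concrete, the sketch below shows a generic way to warp a target image onto the reference frame and composite the two on a shared canvas, assuming a 3x3 homography H is already available (for example, from a homography estimation module). The OpenCV-based warping and the naive averaging in the overlap region are illustrative assumptions only; they are not the paper's network or its learned edge-preserved deformation.

```python
# Minimal sketch, assuming a precomputed 3x3 homography H that maps the
# target image into the reference image's coordinate frame (color images).
# The simple averaging in the overlap is a placeholder, not the paper's
# edge-preserved deformation module.
import cv2
import numpy as np

def stitch_with_homography(reference, target, H):
    h_ref, w_ref = reference.shape[:2]
    h_tgt, w_tgt = target.shape[:2]

    # Project the target's corners to determine the stitched canvas extent.
    corners = np.array([[0, 0], [w_tgt, 0], [w_tgt, h_tgt], [0, h_tgt]],
                       dtype=np.float32).reshape(-1, 1, 2)
    warped_corners = cv2.perspectiveTransform(corners, H).reshape(-1, 2)
    all_corners = np.vstack([warped_corners, [[0, 0], [w_ref, h_ref]]])
    x_min, y_min = np.floor(all_corners.min(axis=0)).astype(int)
    x_max, y_max = np.ceil(all_corners.max(axis=0)).astype(int)

    # Translate so every pixel coordinate on the canvas is non-negative.
    T = np.array([[1, 0, -x_min], [0, 1, -y_min], [0, 0, 1]], dtype=np.float64)
    canvas_size = (x_max - x_min, y_max - y_min)

    warped_target = cv2.warpPerspective(target, T @ H, canvas_size)
    warped_reference = cv2.warpPerspective(reference, T, canvas_size)

    # Naive blend: average where both images have content, otherwise keep either one.
    mask_ref = warped_reference.sum(axis=-1, keepdims=True) > 0
    mask_tgt = warped_target.sum(axis=-1, keepdims=True) > 0
    overlap = mask_ref & mask_tgt
    out = np.where(overlap,
                   warped_reference // 2 + warped_target // 2,
                   warped_reference + warped_target)
    return out.astype(np.uint8)
```

The visible seams and ghosting that such a simple overlap average produces in the presence of parallax are exactly the artifacts the edge-preserved deformation module described in the abstract is designed to suppress.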