Paper Title


A Differentiable Two-stage Alignment Scheme for Burst Image Reconstruction with Large Shift

Authors

Shi Guo, Xi Yang, Jianqi Ma, Gaofeng Ren, Lei Zhang

Abstract


Denoising and demosaicking are two essential steps to reconstruct a clean full-color image from raw data. Recently, joint denoising and demosaicking (JDD) for burst images, namely JDD-B, has attracted much attention: multiple raw images captured in a short time are used to reconstruct a single high-quality image. One key challenge of JDD-B lies in the robust alignment of image frames. State-of-the-art alignment methods in the feature domain cannot effectively utilize the temporal information of burst images, where large shifts commonly exist due to camera and object motion. In addition, the higher resolution (e.g., 4K) of modern imaging devices results in larger displacement between frames. To address these challenges, we design a differentiable two-stage alignment scheme that operates sequentially at the patch and pixel levels for effective JDD-B. The input burst images are first aligned at the patch level by a differentiable progressive block-matching method, which can estimate the offsets between distant frames at a small computational cost. Then we perform implicit pixel-wise alignment in the full-resolution feature domain to refine the alignment results. The two stages are jointly trained in an end-to-end manner. Extensive experiments demonstrate the significant improvement of our method over existing JDD-B methods. Code is available at https://github.com/GuoShi28/2StageAlign.
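The coarse-to-fine patch-matching idea behind the first stage can be illustrated with a minimal NumPy sketch. This is a toy, assuming a single global integer shift and plain SSD block matching; it is not the paper's differentiable implementation, and the names `block_match` and `progressive_align` are ours. The point it demonstrates is how a pyramid lets a small per-level search radius cover a large total displacement:

```python
import numpy as np

def block_match(ref, tgt, search=2):
    """Exhaustive integer search: find (dy, dx) such that rolling `tgt`
    by (dy, dx) best matches `ref`, scored by mean squared error over
    the non-wrapping overlap region."""
    best_err, best_off = np.inf, (0, 0)
    H, W = ref.shape
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ys, ye = max(0, dy), min(H, H + dy)
            xs, xe = max(0, dx), min(W, W + dx)
            r = ref[ys:ye, xs:xe]
            t = tgt[ys - dy:ye - dy, xs - dx:xe - dx]
            err = np.mean((r - t) ** 2)
            if err < best_err:
                best_err, best_off = err, (dy, dx)
    return best_off

def progressive_align(ref, tgt, levels=3, search=2):
    """Coarse-to-fine matching: estimate the shift on a subsampled
    pyramid, refining the running estimate at each finer level, so a
    displacement far larger than `search` is still recovered."""
    dy, dx = 0, 0
    for lvl in reversed(range(levels)):
        s = 2 ** lvl                       # stride of this pyramid level
        r, t = ref[::s, ::s], tgt[::s, ::s]
        # apply the current estimate (in this level's units), then refine
        t = np.roll(t, (dy // s, dx // s), axis=(0, 1))
        ddy, ddx = block_match(r, t, search)
        dy += ddy * s
        dx += ddx * s
    return dy, dx
```

For example, with a frame displaced by (8, -4) pixels, a three-level pyramid recovers the correcting offset (-8, 4) even though each level only searches within +/-2 positions. The paper's scheme replaces the hard argmin with a differentiable (soft) matching so the stage can be trained end-to-end with the pixel-level refinement.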
