Paper Title

PatchNR: Learning from Very Few Images by Patch Normalizing Flow Regularization

Authors

Altekrüger, Fabian, Denker, Alexander, Hagemann, Paul, Hertrich, Johannes, Maass, Peter, Steidl, Gabriele

Abstract

Learning neural networks from only very little available information is an important ongoing research topic with tremendous potential for applications. In this paper, we introduce a powerful regularizer for the variational modeling of inverse problems in imaging. Our regularizer, called patch normalizing flow regularizer (patchNR), involves a normalizing flow learned on small patches of very few images. In particular, the training is independent of the considered inverse problem, so the same regularizer can be applied for different forward operators acting on the same class of images. By investigating the distribution of patches versus that of the whole image class, we prove that our model is indeed a MAP approach. Numerical examples for low-dose and limited-angle computed tomography (CT) as well as superresolution of material images demonstrate that our method provides very high quality results. The training set consists of just six images for CT and one image for superresolution. Finally, we combine our patchNR with ideas from internal learning for performing superresolution of natural images directly from the low-resolution observation, without knowledge of any high-resolution image.
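The abstract describes a variational objective of the form "data fidelity plus a patch-based regularizer", where the regularizer is the negative log-likelihood of all image patches under a normalizing flow. The following is a minimal illustrative sketch of that objective, not the authors' implementation: a single invertible affine map stands in for the learned flow (real patchNR uses a trained multi-layer flow), and all names, patch sizes, and the forward operator `A` are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_patches(img, s=4, stride=2):
    """Extract overlapping s x s patches of `img`, flattened to vectors."""
    H, W = img.shape
    return np.array([img[i:i + s, j:j + s].ravel()
                     for i in range(0, H - s + 1, stride)
                     for j in range(0, W - s + 1, stride)])

# Toy stand-in for a trained normalizing flow: one invertible affine map
# z = W p + b with a standard normal base density on z.
d = 16  # flattened 4x4 patch dimension
W = np.eye(d) + 0.1 * rng.standard_normal((d, d))
b = rng.standard_normal(d)
logdet = np.linalg.slogdet(W)[1]  # log |det dz/dp| of the affine map

def patch_nll(patches):
    """Mean negative log-likelihood of patches via change of variables:
    log p(p) = log N(z; 0, I) + log |det W|."""
    z = patches @ W.T + b
    log_pz = -0.5 * np.sum(z**2, axis=1) - 0.5 * d * np.log(2.0 * np.pi)
    return -np.mean(log_pz + logdet)

def objective(x, y, A, lam=0.1):
    """patchNR-style variational objective:
    0.5 * ||A x - y||^2  +  lam * (patch regularizer)."""
    fidelity = 0.5 * np.sum((A @ x.ravel() - y) ** 2)
    return fidelity + lam * patch_nll(extract_patches(x))

# Usage with a random linear forward operator (a stand-in for, e.g.,
# a CT or downsampling operator) and a noiseless observation.
x = rng.random((8, 8))
A = rng.standard_normal((20, 64))
y = A @ x.ravel()
val = objective(x, y, A)
print(val)
```

In the paper's setting, the affine map above would be replaced by a trained flow, and `x` would be optimized (e.g. by gradient descent) to minimize this objective; because training the flow only needs patches, the same regularizer works for any forward operator on the same image class.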
