Paper Title
Unpaired Deep Learning for Accelerated MRI using Optimal Transport Driven CycleGAN
Paper Authors
Paper Abstract
Recently, deep learning approaches for accelerated MRI have been extensively studied thanks to their high-performance reconstruction despite significantly reduced runtime complexity. These neural networks are usually trained in a supervised manner, so matched pairs of subsampled and fully sampled k-space data are required. Unfortunately, matched fully sampled k-space data are often difficult to acquire, since their acquisition requires a long scan time and often forces changes to the acquisition protocol. Therefore, unpaired deep learning without matched label data has become a very important research topic. In this paper, we propose an unpaired deep learning approach using an optimal transport driven cycle-consistent generative adversarial network (OT-cycleGAN) that employs a single pair of generator and discriminator. The proposed OT-cycleGAN architecture is rigorously derived from a dual formulation of the optimal transport problem using a specially designed penalized least squares cost. The experimental results show that our method can reconstruct high-resolution MR images from accelerated k-space data from both single-coil and multi-coil acquisitions, without requiring matched reference data.
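To make the abstract's architecture concrete, below is a minimal, illustrative sketch of the kind of training objective it describes: a single learned generator G (undersampled k-space to image) and a single discriminator D on the image domain, with the image-to-k-space direction handled by the known undersampling forward operator, so no second learned generator is needed. All names here (forward_op, SmallGenerator, ot_cyclegan_losses, the LSGAN-style adversarial term, the weight gamma) are assumptions for illustration, not the authors' implementation or exact loss.

```python
# Hypothetical sketch of an OT-cycleGAN-style objective with one generator and
# one discriminator; unpaired images x and undersampled k-space y are assumed.
import torch
import torch.nn as nn


def forward_op(x, mask):
    """Known forward model: 2-D FFT followed by k-space undersampling."""
    return mask * torch.fft.fft2(x)


def zero_filled_recon(y):
    """Naive adjoint (zero-filled) reconstruction used as the generator input."""
    return torch.fft.ifft2(y).real


class SmallGenerator(nn.Module):
    """Toy CNN standing in for the reconstruction network G."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, y):
        return self.net(zero_filled_recon(y).unsqueeze(1)).squeeze(1)


class SmallDiscriminator(nn.Module):
    """Toy CNN discriminator on the image domain (LSGAN-style output)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, stride=2, padding=1),
        )

    def forward(self, x):
        return self.net(x.unsqueeze(1))


def ot_cyclegan_losses(G, D, x_unpaired, y_unpaired, mask, gamma=10.0):
    """Cycle-consistency plus adversarial terms with a single (G, D) pair.

    x_unpaired: fully sampled images; y_unpaired: undersampled k-space from
    different subjects (no matched pairs are assumed).
    """
    # k-space -> image -> k-space cycle (data consistency of the reconstruction)
    y_cycle = forward_op(G(y_unpaired), mask)
    loss_cycle_y = torch.mean(torch.abs(y_unpaired - y_cycle))

    # image -> k-space -> image cycle, using the known forward operator
    x_cycle = G(forward_op(x_unpaired, mask))
    loss_cycle_x = torch.mean(torch.abs(x_unpaired - x_cycle))

    # LSGAN-style adversarial terms on the image domain; in an actual training
    # loop, G(y_unpaired) would be detached when updating D.
    d_real = D(x_unpaired)
    d_fake = D(G(y_unpaired))
    loss_D = torch.mean((d_real - 1) ** 2) + torch.mean(d_fake ** 2)
    loss_G = gamma * (loss_cycle_x + loss_cycle_y) + torch.mean((d_fake - 1) ** 2)
    return loss_G, loss_D


if __name__ == "__main__":
    G, D = SmallGenerator(), SmallDiscriminator()
    mask = (torch.rand(64, 64) < 0.3).float()        # random undersampling mask
    x = torch.randn(4, 64, 64)                       # unpaired fully sampled images
    y = forward_op(torch.randn(4, 64, 64), mask)     # unpaired undersampled k-space
    loss_G, loss_D = ot_cyclegan_losses(G, D, x, y, mask)
    print(loss_G.item(), loss_D.item())
```

Replacing one half of the usual cycleGAN with the deterministic undersampling operator is what makes a single generator-discriminator pair sufficient in this sketch; the paper's actual losses follow from its optimal transport dual formulation and may differ in detail.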