Paper Title
DeepTensor: Low-Rank Tensor Decomposition with Deep Network Priors
Paper Authors
Paper Abstract
DeepTensor is a computationally efficient framework for low-rank decomposition of matrices and tensors using deep generative networks. We decompose a tensor as the product of low-rank tensor factors (e.g., a matrix as the outer product of two vectors), where each low-rank tensor is generated by a deep network (DN) that is trained in a self-supervised manner to minimize the mean-squared approximation error. Our key observation is that the implicit regularization inherent in DNs enables them to capture nonlinear signal structures (e.g., manifolds) that are out of the reach of classical linear methods like the singular value decomposition (SVD) and principal component analysis (PCA). Furthermore, in contrast to the SVD and PCA, whose performance deteriorates when the tensor's entries deviate from additive white Gaussian noise, we demonstrate that the performance of DeepTensor is robust to a wide range of distributions. We validate that DeepTensor is a robust and computationally efficient drop-in replacement for the SVD, PCA, nonnegative matrix factorization (NMF), and similar decompositions by exploring a range of real-world applications, including hyperspectral image denoising, 3D MRI tomography, and image classification. In particular, DeepTensor offers a 6dB signal-to-noise ratio improvement over standard denoising methods for signals corrupted by Poisson noise and learns to decompose 3D tensors 60 times faster than a single DN equipped with 3D convolutions.
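To make the decomposition concrete, below is a minimal sketch of the rank-r matrix case, assuming PyTorch: two small networks generate the factors U and V, and both are trained in a self-supervised manner to minimize the mean-squared approximation error, as the abstract describes. The network architectures, the fixed random inputs z_u and z_v, and all hyperparameters are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of the DeepTensor idea for a rank-r matrix (assumed
# PyTorch setup; architectures and hyperparameters are illustrative).
import torch
import torch.nn as nn

def deep_tensor_matrix(X, rank=5, steps=2000, lr=1e-3):
    """Approximate X (m x n) as U @ V.T, where the low-rank factors
    U (m x rank) and V (n x rank) are produced by two deep networks
    trained self-supervised to minimize mean-squared error."""
    m, n = X.shape
    # Fixed random codes; the networks map them to the factor matrices.
    z_u, z_v = torch.randn(1, 64), torch.randn(1, 64)
    net_u = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, m * rank))
    net_v = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, n * rank))
    opt = torch.optim.Adam([*net_u.parameters(), *net_v.parameters()], lr=lr)
    for _ in range(steps):
        U = net_u(z_u).reshape(m, rank)
        V = net_v(z_v).reshape(n, rank)
        loss = ((X - U @ V.T) ** 2).mean()  # mean-squared approximation error
        opt.zero_grad()
        loss.backward()
        opt.step()
    return U.detach(), V.detach()

# Usage example: factor a noisy 100 x 80 matrix at rank 3.
# U, V = deep_tensor_matrix(torch.randn(100, 80), rank=3)
```

Under this reading, the implicit regularization comes from parameterizing the factors by the networks rather than optimizing U and V directly; swapping the direct factors for network outputs is the only change relative to classical low-rank fitting.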