Paper Title

Alternating minimization algorithms for graph regularized tensor completion

Authors

Guan, Yu; Dong, Shuyu; Gao, Bin; Absil, P.-A.; Glineur, François

Abstract

We consider a Canonical Polyadic (CP) decomposition approach to low-rank tensor completion (LRTC) by incorporating external pairwise similarity relations through graph Laplacian regularization on the CP factor matrices. The use of graph regularization brings benefits in the learning accuracy of LRTC but, at the same time, induces coupling graph Laplacian terms that hinder the optimization of the tensor completion model. To solve graph-regularized LRTC, we propose efficient alternating minimization algorithms by leveraging the block structure of the underlying CP decomposition-based model. For the subproblems of alternating minimization, a linear conjugate gradient subroutine is specifically adapted to graph-regularized LRTC. Alternatively, we circumvent the complicating coupling effects of the graph Laplacian terms by using the alternating direction method of multipliers (ADMM). Based on the Kurdyka-Łojasiewicz property, we show that the sequence generated by the proposed algorithms globally converges to a critical point of the objective function. Moreover, the complexity and convergence rate are also derived. In addition, numerical experiments on synthetic and real data show that the graph-regularized tensor completion model improves recovery results compared to models without graph regularization, and that the proposed algorithms achieve gains in time efficiency over existing algorithms.
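To illustrate the block structure the abstract refers to, here is a minimal sketch of alternating minimization for a graph-regularized CP model. This is not the authors' implementation: it uses a fully observed toy tensor (no completion mask), hypothetical path-graph Laplacians for each mode, and a direct Sylvester solve for each factor subproblem in place of the paper's conjugate gradient or ADMM subroutines. It does show the key coupling: with objective ‖T₍ₙ₎ − Uₙ Kᵀ‖² + α·tr(UₙᵀLₙUₙ), the first-order condition for factor Uₙ is the Sylvester equation α·Lₙ Uₙ + Uₙ (KᵀK) = T₍ₙ₎ K, where K is the Khatri-Rao product of the other factors.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def khatri_rao(U, V):
    """Column-wise Kronecker (Khatri-Rao) product of U (I x r) and V (J x r)."""
    r = U.shape[1]
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, r)

def unfold(T, mode):
    """Mode-n unfolding consistent with C-order reshaping."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def path_laplacian(n):
    """Graph Laplacian of a path graph on n nodes (toy similarity structure)."""
    W = np.zeros((n, n))
    idx = np.arange(n - 1)
    W[idx, idx + 1] = W[idx + 1, idx] = 1.0
    return np.diag(W.sum(axis=1)) - W

def graph_cp_als(T, rank, alpha=1e-3, iters=100, seed=0):
    """Alternating minimization for graph-regularized CP fitting.

    Each factor update solves the Sylvester equation
        alpha * L_n @ U_n + U_n @ (K.T @ K) = unfold(T, n) @ K,
    coupling the rows of U_n through the Laplacian L_n."""
    rng = np.random.default_rng(seed)
    U = [rng.standard_normal((d, rank)) for d in T.shape]
    L = [path_laplacian(d) for d in T.shape]
    for _ in range(iters):
        for n in range(3):
            others = [U[m] for m in range(3) if m != n]
            K = khatri_rao(others[0], others[1])
            G = K.T @ K + 1e-10 * np.eye(rank)  # tiny ridge for stability
            U[n] = solve_sylvester(alpha * L[n], G, unfold(T, n) @ K)
    return U

def cp_reconstruct(U):
    A, B, C = U
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Toy run on an exactly rank-2 tensor.
rng = np.random.default_rng(1)
F = [rng.standard_normal((d, 2)) for d in (6, 7, 8)]
T = np.einsum('ir,jr,kr->ijk', *F)
U = graph_cp_als(T, rank=2)
rel_err = np.linalg.norm(cp_reconstruct(U) - T) / np.linalg.norm(T)
```

Because α is small and the toy tensor is exactly low-rank, the fit is nearly exact; with real similarity graphs and a completion mask, the Laplacian term is what transfers information from observed to unobserved entries, at the cost of the coupled subproblems the paper's CG and ADMM subroutines are designed to handle.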
