Paper Title
The Symmetric Generalized Eigenvalue Problem as a Nash Equilibrium
Paper Authors
Paper Abstract
The symmetric generalized eigenvalue problem (SGEP) is a fundamental concept in numerical linear algebra. It captures the solution of many classical machine learning problems such as canonical correlation analysis, independent component analysis, partial least squares, linear discriminant analysis, principal components, and others. Despite this, most general solvers are prohibitively expensive when dealing with streaming data sets (i.e., minibatches), and research has instead concentrated on finding efficient solutions to specific problem instances. In this work, we develop a game-theoretic formulation of the top-$k$ SGEP whose Nash equilibrium is the set of generalized eigenvectors. We also present a parallelizable algorithm with guaranteed asymptotic convergence to the Nash. Current state-of-the-art methods require $O(d^2k)$ runtime complexity per iteration, which is prohibitively expensive when the number of dimensions ($d$) is large. We show how to modify this parallel approach to achieve $O(dk)$ runtime complexity. Empirically, we demonstrate that the resulting algorithm is able to solve a variety of SGEP problem instances, including a large-scale analysis of neural network activations.
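For context, the top-$k$ SGEP asks for the $k$ generalized eigenvectors $v$ with the largest eigenvalues $\lambda$ satisfying $Av = \lambda Bv$, where $A$ is symmetric and $B$ is symmetric positive definite. The sketch below is not the paper's streaming algorithm; it is a minimal dense reference using SciPy's generalized symmetric solver on a hypothetical toy instance (the matrices, dimensions $d$, $k$, and random seed are illustrative assumptions). A dense solve of this kind costs roughly $O(d^3)$, which is what cheaper per-iteration updates on minibatches aim to avoid.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical toy instance (illustrative only): A symmetric, B symmetric positive definite.
rng = np.random.default_rng(0)
d, k = 50, 5
A = rng.standard_normal((d, d))
A = (A + A.T) / 2
B = rng.standard_normal((d, d))
B = B @ B.T + d * np.eye(d)

# Dense reference solve: eigh handles the generalized problem A v = lambda B v
# and returns eigenvalues in ascending order, so the top-k pairs are the last k.
eigvals, eigvecs = eigh(A, B)
top_vals = eigvals[-k:][::-1]
V = eigvecs[:, -k:][:, ::-1]  # d x k; columns are B-orthonormal (V.T @ B @ V ~ I_k)

# Residual check: each column v_i should satisfy A v_i = lambda_i B v_i.
residual = np.abs(A @ V - (B @ V) * top_vals).max()
print("top-k generalized eigenvalues:", top_vals)
print("max residual:", residual)
```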