Paper Title

Personalized PCA: Decoupling Shared and Unique Features

Authors

Naichen Shi, Raed Al Kontar

Abstract

In this paper, we tackle a significant challenge in PCA: heterogeneity. When data are collected from different sources with heterogeneous trends while still sharing some congruency, it is critical to extract shared knowledge while retaining the unique features of each source. To this end, we propose personalized PCA (PerPCA), which uses mutually orthogonal global and local principal components to encode both unique and shared features. We show that, under mild conditions, both unique and shared features can be identified and recovered by a constrained optimization problem, even if the covariance matrices are immensely different. Also, we design a fully federated algorithm inspired by distributed Stiefel gradient descent to solve the problem. The algorithm introduces a new group of operations called generalized retractions to handle orthogonality constraints, and only requires global PCs to be shared across sources. We prove the linear convergence of the algorithm under suitable assumptions. Comprehensive numerical experiments highlight PerPCA's superior performance in feature extraction and prediction from heterogeneous datasets. As a systematic approach to decouple shared and unique features from heterogeneous datasets, PerPCA finds applications in several tasks, including video segmentation, topic extraction, and feature clustering.
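To make the high-level description above more concrete, below is a minimal, self-contained Python sketch of the shared/unique decomposition idea on two synthetic sources. It uses plain Stiefel gradient ascent with a standard QR retraction and simple averaging of the global block; it is not the paper's PerPCA algorithm and does not implement its generalized retractions. All names (U for global PCs, V[i] for local PCs), dimensions, and step sizes are assumptions chosen for illustration.

# Illustrative sketch only: a toy two-source version of the shared/unique PC idea,
# using plain Stiefel gradient ascent with a QR retraction. This is NOT the paper's
# PerPCA algorithm or its generalized retractions; all settings here are assumed.
import numpy as np

rng = np.random.default_rng(0)
d, k_global, k_local, steps, lr = 20, 2, 1, 500, 0.05

# Simulate two sources that share a common low-rank subspace plus unique directions.
shared = rng.standard_normal((d, k_global))
covs = []
for _ in range(2):
    unique = rng.standard_normal((d, k_local))
    basis = np.hstack([shared, unique])
    covs.append(basis @ basis.T + 0.1 * np.eye(d))

def retract(X):
    # Map a perturbed matrix back onto the Stiefel manifold via QR (one common retraction).
    Q, R = np.linalg.qr(X)
    return Q * np.sign(np.diag(R))  # fix column signs for a deterministic factor

U = retract(rng.standard_normal((d, k_global)))                  # global PCs, shared by all sources
V = [retract(rng.standard_normal((d, k_local))) for _ in covs]   # local PCs, one block per source

for _ in range(steps):
    U_updates = []
    for i, S in enumerate(covs):
        W = np.hstack([U, V[i]])                 # joint orthonormal frame [U, V_i]
        G = 2 * S @ W                            # Euclidean gradient of tr(W^T S W)
        # Project onto the tangent space of the Stiefel manifold at W.
        G_tan = G - W @ (W.T @ G + G.T @ W) / 2
        W_new = retract(W + lr * G_tan)          # gradient step followed by retraction
        U_updates.append(W_new[:, :k_global])
        V[i] = W_new[:, k_global:]
    # Federated flavor: only the global block is averaged across sources, then re-orthonormalized.
    U = retract(np.mean(U_updates, axis=0))
    # Keep each local block orthogonal to the refreshed global block (a simplification;
    # the paper's generalized retractions handle the orthogonality constraint more carefully).
    V = [retract(Vi - U @ (U.T @ Vi)) for Vi in V]

# Quick illustrative check: singular values near 1 mean U aligns with the planted shared subspace.
print(np.linalg.svd(U.T @ np.linalg.qr(shared)[0], compute_uv=False))

The averaging step mirrors the abstract's statement that only global PCs need to be communicated across sources; in this sketch each local block V[i] never leaves its source.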
