Paper Title
Bi-level algorithm for optimizing hyperparameters in penalized nonnegative matrix factorization
Paper Authors
Paper Abstract
Learning approaches rely on hyperparameters that impact the algorithm's performance and affect the knowledge extraction process from data. Recently, Nonnegative Matrix Factorization (NMF) has attracted growing interest as a learning algorithm. This technique captures the latent information embedded in large datasets while preserving feature properties. NMF can be formalized as a penalized optimization task in which tuning the penalty hyperparameters is an open issue, and the current literature does not provide a general framework for addressing it. This study proposes to cast the problem of tuning the penalty hyperparameters in NMF as a bi-level optimization problem. We design a novel algorithm, named Alternating Bi-level (AltBi), which incorporates the hyperparameter tuning procedure into the updates of the NMF factors. Results on the existence and convergence of numerical solutions are established under appropriate assumptions, and numerical experiments are provided.
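The abstract does not spell out the AltBi updates, so the following is only a minimal Python sketch of the general idea it describes: an inner (lower-level) loop of multiplicative updates for a penalized NMF objective, and an outer (upper-level) step that adjusts the penalty weight against a held-out validation error. The L1 penalty, the validation matrix X_val, and all function names are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def penalized_nmf_step(X, W, H, lam, eps=1e-12):
    """One pair of multiplicative updates for
    min_{W,H >= 0} ||X - W H||_F^2 + lam * ||H||_1 (L1 penalty assumed for illustration)."""
    H *= (W.T @ X) / (W.T @ W @ H + lam + eps)  # penalty weight enters the denominator
    W *= (X @ H.T) / (W @ H @ H.T + eps)        # standard unpenalized update for W
    return W, H

def bilevel_penalized_nmf(X_train, X_val, r, lam=0.1, outer_iters=50,
                          inner_iters=5, eta=1e-2, seed=0):
    """Toy bi-level loop: the lower level runs penalized NMF updates at the current lam;
    the upper level nudges lam via a finite-difference step on a validation error.
    X_val is assumed to be a validation matrix with the same shape as X_train."""
    rng = np.random.default_rng(seed)
    m, n = X_train.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(outer_iters):
        # Lower level: update the NMF factors for the current penalty hyperparameter.
        for _ in range(inner_iters):
            W, H = penalized_nmf_step(X_train, W, H, lam)

        # Upper level: one-sided finite-difference estimate of d(val error)/d(lam).
        def val_err(l):
            Wt, Ht = penalized_nmf_step(X_train, W.copy(), H.copy(), l)
            return np.linalg.norm(X_val - Wt @ Ht) ** 2

        grad = (val_err(lam + 1e-3) - val_err(lam)) / 1e-3
        lam = max(lam - eta * grad, 0.0)  # keep the penalty hyperparameter nonnegative
    return W, H, lam
```

In the paper's actual AltBi scheme the hyperparameter update is interleaved with the factor updates and comes with existence and convergence results; the finite-difference step above is only a stand-in to make the lower-level/upper-level structure concrete.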