Paper Title


Min-Max Bilevel Multi-objective Optimization with Applications in Machine Learning

Authors

Alex Gu, Songtao Lu, Parikshit Ram, Lily Weng

Abstract


We consider a generic min-max multi-objective bilevel optimization problem with applications in robust machine learning such as representation learning and hyperparameter optimization. We design MORBiT, a novel single-loop gradient descent-ascent bilevel optimization algorithm, to solve the generic problem, and present a novel analysis showing that MORBiT converges to a first-order stationary point at a rate of $\widetilde{\mathcal{O}}(n^{1/2} K^{-2/5})$ for a class of weakly convex problems with $n$ objectives after $K$ iterations of the algorithm. Our analysis utilizes novel results to handle the non-smooth min-max multi-objective setup and to obtain a sublinear dependence on the number of objectives $n$. Experimental results on robust representation learning and robust hyperparameter optimization showcase (i) the advantages of considering the min-max multi-objective setup, and (ii) convergence properties of the proposed MORBiT. Our code is at https://github.com/minimario/MORBiT.
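To make the min-max multi-objective setup concrete, the following is a minimal single-loop gradient descent-ascent sketch on a toy problem $\min_x \max_{\lambda \in \Delta} \sum_i \lambda_i f_i(x)$ with quadratic objectives. This is an illustrative sketch only, not the authors' MORBiT implementation (which additionally handles a bilevel lower level); the objectives, step sizes, and helper `project_simplex` are assumptions for the example.

```python
import numpy as np

# Toy min-max multi-objective problem:
#   min_x max_{lam in simplex} sum_i lam_i * f_i(x)
# with quadratic objectives f_i(x) = 0.5 * (x - c_i)^2.
# A single-loop scheme alternates one descent step on x with one
# (projected) ascent step on lam per iteration.

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

centers = np.array([-1.0, 0.0, 2.0])        # minimizer c_i of each objective
x = 5.0                                      # min (descent) variable
lam = np.ones_like(centers) / len(centers)   # max (ascent) variable on simplex
eta_x, eta_lam = 0.1, 0.05                   # step sizes (illustrative)

for _ in range(500):
    f = 0.5 * (x - centers) ** 2             # objective values f_i(x)
    grad_x = np.dot(lam, x - centers)        # gradient of weighted sum in x
    x -= eta_x * grad_x                      # descent step on x
    lam = project_simplex(lam + eta_lam * f) # projected ascent step on lam

# The weights concentrate on the worst-case objectives (centers -1 and 2),
# and x settles where those two extremes are balanced (near 0.5).
print(x, lam)
```

For these centers the min-max solution balances the two extreme objectives, so `x` converges to roughly 0.5 and the middle weight `lam[1]` vanishes; the actual MORBiT algorithm applies the same descent-ascent pattern while also tracking lower-level solutions.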
