Paper Title
Semialgebraic Optimization for Lipschitz Constants of ReLU Networks
Paper Authors
Paper Abstract
The Lipschitz constant of a network plays an important role in many applications of deep learning, such as robustness certification and Wasserstein Generative Adversarial Networks. We introduce a semidefinite programming hierarchy to estimate the global and local Lipschitz constants of multilayer deep neural networks. The novelty is to combine a polynomial lifting of the ReLU function's derivative with a weak generalization of Putinar's positivity certificate. This idea could also apply to other nearly sparse polynomial optimization problems in machine learning. We empirically demonstrate that our method provides a trade-off with respect to the state-of-the-art linear programming approach, and in some cases we obtain better bounds in less time.
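To make the quantities in the abstract concrete, here is a minimal Python sketch (not the paper's SDP hierarchy; the network sizes, sampling scheme, and all names are illustrative assumptions) contrasting two easy bounds that the hierarchy is designed to improve on: the naive upper bound given by the product of layer-wise spectral norms, and an empirical lower bound from Jacobian spectral norms at sampled inputs. The Jacobian computation uses the fact that the ReLU derivative at each neuron lies in {0, 1}, which is the quantity the paper's polynomial lifting encodes.

import numpy as np

rng = np.random.default_rng(0)

# Toy two-hidden-layer ReLU network f(x) = W3 relu(W2 relu(W1 x)).
# Sizes are arbitrary illustrative choices.
W1 = rng.standard_normal((16, 8)) / np.sqrt(8)
W2 = rng.standard_normal((16, 16)) / np.sqrt(16)
W3 = rng.standard_normal((1, 16)) / np.sqrt(16)
weights = [W1, W2, W3]

def jacobian_at(x, weights):
    """Jacobian of the ReLU network at x via the chain rule:
    J = W3 @ D2 @ W2 @ D1 @ W1, where Di = diag(1[pre-activation > 0])
    collects the ReLU derivatives, each in {0, 1}."""
    J = np.eye(weights[0].shape[1])
    a = x
    for W in weights[:-1]:
        z = W @ a                                   # pre-activation
        J = np.diag((z > 0).astype(float)) @ W @ J  # apply ReLU derivative
        a = np.maximum(z, 0.0)                      # ReLU activation
    return weights[-1] @ J

# Naive upper bound: product of spectral norms of the weight matrices.
upper = np.prod([np.linalg.norm(W, 2) for W in weights])

# Empirical lower bound: largest Jacobian spectral norm over random samples.
lower = max(np.linalg.norm(jacobian_at(rng.standard_normal(8), weights), 2)
            for _ in range(1000))

print(f"empirical lower bound:        {lower:.4f}")
print(f"product-of-norms upper bound: {upper:.4f}")

The true Lipschitz constant lies between these two quantities; the certified upper bounds produced by the paper's semidefinite hierarchy are typically tighter than the naive product bound while remaining valid upper bounds.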