Paper Title

Exactly Computing the Local Lipschitz Constant of ReLU Networks

Authors

Matt Jordan and Alexandros G. Dimakis

Abstract

The local Lipschitz constant of a neural network is a useful metric with applications in robustness, generalization, and fairness evaluation. We provide novel analytic results relating the local Lipschitz constant of nonsmooth vector-valued functions to a maximization over the norm of the generalized Jacobian. We present a sufficient condition for which backpropagation always returns an element of the generalized Jacobian, and reframe the problem over this broad class of functions. We show strong inapproximability results for estimating Lipschitz constants of ReLU networks, and then formulate an algorithm to compute these quantities exactly. We leverage this algorithm to evaluate the tightness of competing Lipschitz estimators and the effects of regularized training on the Lipschitz constant.
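To make the Jacobian connection concrete, below is a minimal NumPy sketch, not the paper's exact algorithm: for a tiny two-layer ReLU network with illustrative random weights, the spectral norm of one element of the generalized Jacobian (the matrix backpropagation would return) gives a lower bound on the local l2 Lipschitz constant. The paper's exact method instead maximizes this norm over all activation patterns reachable in the region of interest.

```python
import numpy as np

# A tiny two-layer ReLU network f(x) = W2 @ relu(W1 @ x).
# Weights are illustrative, not taken from the paper.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))

def jacobian_at(x):
    """One element of the generalized Jacobian of f at x.

    Where ReLU is differentiable this is the ordinary Jacobian
    W2 @ diag(1[W1 x > 0]) @ W1; at kinks it picks a single valid
    element, mirroring what backpropagation returns.
    """
    d = (W1 @ x > 0).astype(float)        # ReLU activation pattern at x
    return W2 @ (d[:, None] * W1)         # row-scale W1 by the pattern

x0 = np.array([0.5, -1.0, 2.0])
J = jacobian_at(x0)

# The spectral norm of this single Jacobian lower-bounds the local
# l2 Lipschitz constant on any neighborhood containing x0.
lower_bound = np.linalg.norm(J, 2)
print(lower_bound)
```

Evaluating this bound at many sampled points (and taking the maximum) is a common heuristic estimator; the paper's contribution is computing the true maximum over the norm of the generalized Jacobian exactly.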
