Title

Towards Global Neural Network Abstractions with Locally-Exact Reconstruction

Authors

Edoardo Manino, Iury Bessa, Lucas Cordeiro

Abstract

Neural networks are a powerful class of non-linear functions. However, their black-box nature makes it difficult to explain their behaviour and certify their safety. Abstraction techniques address this challenge by transforming the neural network into a simpler, over-approximated function. Unfortunately, existing abstraction techniques are slack, which limits their applicability to small local regions of the input domain. In this paper, we propose Global Interval Neural Network Abstractions with Center-Exact Reconstruction (GINNACER). Our novel abstraction technique produces sound over-approximation bounds over the whole input domain while guaranteeing exact reconstructions for any given local input. Our experiments show that GINNACER is several orders of magnitude tighter than state-of-the-art global abstraction techniques, while being competitive with local ones.
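The abstract's notion of a "sound over-approximation" can be illustrated with plain interval bound propagation: an input box is pushed through the network so that the output interval is guaranteed to contain every true output. The sketch below is a minimal, generic example of this idea, not GINNACER's actual construction (which additionally guarantees exact reconstruction at a chosen local input); the two-layer ReLU network and its weights are hypothetical.

```python
import numpy as np

# Hypothetical 2-layer ReLU network with fixed random weights
# (illustrative only; not the network or method from the paper).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

def forward(x):
    """Concrete forward pass of the ReLU network."""
    h = np.maximum(W1 @ x + b1, 0.0)
    return W2 @ h + b2

def interval_forward(lo, hi):
    """Soundly propagate an input box [lo, hi] through the network."""
    # Affine layer: split weights by sign so each bound is computed
    # from the worst-case combination of input bounds.
    Wp, Wn = np.maximum(W1, 0), np.minimum(W1, 0)
    l1 = Wp @ lo + Wn @ hi + b1
    u1 = Wp @ hi + Wn @ lo + b1
    # ReLU is monotone, so applying it to both bounds stays sound.
    l1, u1 = np.maximum(l1, 0.0), np.maximum(u1, 0.0)
    Wp, Wn = np.maximum(W2, 0), np.minimum(W2, 0)
    return Wp @ l1 + Wn @ u1 + b2, Wp @ u1 + Wn @ l1 + b2

# Global input box and its sound output bounds.
lo, hi = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
out_lo, out_hi = interval_forward(lo, hi)

# Soundness check: every sampled input's output lies within the bounds.
for _ in range(1000):
    x = rng.uniform(lo, hi)
    y = forward(x)
    assert np.all(out_lo <= y) and np.all(y <= out_hi)
```

Such interval bounds are sound but often loose over large input regions, which is exactly the slackness the abstract says motivates GINNACER's tighter global abstraction.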
