Paper Title

Augmenting Neural Networks with Priors on Function Values

Paper Authors

Hunter Nisonoff, Yixin Wang, Jennifer Listgarten

Paper Abstract

The need for function estimation in label-limited settings is common in the natural sciences. At the same time, prior knowledge of function values is often available in these domains. For example, data-free biophysics-based models can be informative on protein properties, while quantum-based computations can be informative on small molecule properties. How can we coherently leverage such prior knowledge to help improve a neural network model that is quite accurate in some regions of input space -- typically near the training data -- but wildly wrong in other regions? Bayesian neural networks (BNN) enable the user to specify prior information only on the neural network weights, not directly on the function values. Moreover, there is in general no clear mapping between these. Herein, we tackle this problem by developing an approach to augment BNNs with prior information on the function values themselves. Our probabilistic approach yields predictions that rely more heavily on the prior information when the epistemic uncertainty is large, and more heavily on the neural network when the epistemic uncertainty is small.
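The abstract's key behavior, relying more on the prior where epistemic uncertainty is large and more on the network where it is small, can be illustrated in a simplified, generic form (this is an illustrative sketch, not the paper's actual formulation) by a precision-weighted combination of two Gaussian predictions:

```python
def combine_gaussians(mu_nn, var_nn, mu_prior, var_prior):
    """Precision-weighted product of two Gaussian predictive densities.

    The combined mean interpolates between the network's prediction and
    the prior's prediction in proportion to their precisions (inverse
    variances), so whichever source is more certain dominates.
    Illustrative only; the paper's method is more general than this.
    """
    precision = 1.0 / var_nn + 1.0 / var_prior
    var = 1.0 / precision
    mu = var * (mu_nn / var_nn + mu_prior / var_prior)
    return mu, var

# Large epistemic uncertainty in the network: lean on the prior.
mu_a, _ = combine_gaussians(mu_nn=1.0, var_nn=100.0, mu_prior=0.0, var_prior=1.0)

# Small epistemic uncertainty in the network: lean on the network.
mu_b, _ = combine_gaussians(mu_nn=1.0, var_nn=0.01, mu_prior=0.0, var_prior=1.0)
```

Here `mu_a` lands near the prior mean (0.0) and `mu_b` near the network's prediction (1.0), matching the qualitative behavior the abstract describes.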
