Paper Title

Estimating Multiplicative Relations in Neural Networks

Paper Authors

Goel, Bhaavan

Paper Abstract

The universal approximation theorem suggests that a shallow neural network can approximate any function. The input to the neurons at each layer is a weighted sum of the previous layer's neurons, to which an activation is then applied. These activation functions perform very well when the output is a linear combination of the input data. When trying to learn a function that involves products of the input data, however, neural networks tend to overfit the data to approximate the function. In this paper we use properties of logarithmic functions to propose a pair of activation functions that translate products into linear expressions and can be learned using backpropagation. We try to generalize this approach to some complex arithmetic functions and test the accuracy on a distribution disjoint from the training set.
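The abstract does not give implementation details, but the core idea of turning products into sums via logarithms can be illustrated with a minimal PyTorch sketch. Everything here is an assumption for illustration: the class names LogActivation and ExpActivation, the epsilon clamp for non-positive inputs, and the specific layer arrangement are not taken from the paper.

```python
import torch
import torch.nn as nn

class LogActivation(nn.Module):
    """Maps inputs to log space so that products become sums.

    The epsilon clamp (an assumption, not from the paper) keeps
    log() defined if an input is zero or negative.
    """
    def __init__(self, eps: float = 1e-6):
        super().__init__()
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.log(torch.clamp(x, min=self.eps))

class ExpActivation(nn.Module):
    """Maps log-space sums back to the original scale, recovering products."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.exp(x)

# A weighted sum of log-inputs followed by exp computes x1^w1 * x2^w2,
# so a linear layer placed between the two activations can learn
# multiplicative relations with ordinary backpropagation.
model = nn.Sequential(
    LogActivation(),
    nn.Linear(2, 1, bias=False),
    ExpActivation(),
)

# Example: learn y = x1 * x2 (positive inputs, so the log is well defined).
x = torch.rand(256, 2) * 5 + 0.1
y = (x[:, 0] * x[:, 1]).unsqueeze(1)
opt = torch.optim.Adam(model.parameters(), lr=0.05)
for _ in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
print(model[1].weight)  # should approach [[1.0, 1.0]]
```

In this sketch the linear weights act as learned exponents (w1 = w2 = 1 recovers a plain product), which is what lets backpropagation fit multiplicative relations as linear expressions in log space.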
