Paper Title

Efficient Design of Neural Networks with Random Weights

Authors

Patrikar, Ajay M.

Abstract

Single layer feedforward networks with random weights are known for their non-iterative and fast training algorithms and are successful in a variety of classification and regression problems. A major drawback of these networks is that they require a large number of hidden units. In this paper, we propose a technique to reduce the number of hidden units substantially without affecting the accuracy of the networks significantly. We introduce the concept of primary and secondary hidden units. The weights for the primary hidden units are chosen randomly while the secondary hidden units are derived using pairwise combinations of the primary hidden units. Using this technique, we show that the number of hidden units can be reduced by at least one order of magnitude. We experimentally show that this technique leads to significant drop in computations at inference time and has only a minor impact on network accuracy. A huge reduction in computations is possible if slightly lower accuracy is acceptable.
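The abstract describes the architecture only at a high level, so the following is a minimal sketch, not the paper's actual method: a random-weight network whose primary hidden units use untrained random weights, whose secondary units are formed here as elementwise products of pairs of primary activations (one plausible reading of "pairwise combinations"; the abstract does not specify the operation), and whose output weights are solved non-iteratively by ridge regression, the usual training step for such networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: learn y = sin(x) from noisy samples.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

n_primary = 10  # few primary units; their weights are random and never trained

# Primary hidden units: random input weights and biases, tanh activation.
W = rng.normal(size=(X.shape[1], n_primary))
b = rng.normal(size=n_primary)
H = np.tanh(X @ W + b)  # shape (200, n_primary)

# Secondary hidden units: pairwise combinations of primary activations.
# Elementwise products of all distinct unit pairs are an assumption here;
# they add expressive features without any new random input weights.
i, j = np.triu_indices(n_primary, k=1)
S = H[:, i] * H[:, j]  # shape (200, n_primary*(n_primary-1)//2)

# Concatenate primary and secondary features, then solve the output weights
# in closed form (ridge regression) -- a non-iterative training step.
Phi = np.hstack([H, S])
lam = 1e-3
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

pred = Phi @ beta
mse = np.mean((pred - y) ** 2)
```

With 10 primary units this yields 45 secondary units for free at training time; at inference, computing a secondary activation costs one multiply instead of a full input-weight dot product, which is where the claimed reduction in computation would come from.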
