Paper Title

Activation function impact on Sparse Neural Networks

Authors

Dubowski, Adam

Abstract

While the concept of a Sparse Neural Network has been researched for some time, researchers have only recently made notable progress in the matter. Techniques like Sparse Evolutionary Training allow for significantly lower computational complexity when compared to fully connected models by reducing redundant connections. That typically takes place in an iterative process of weight creation and removal during network training. Although there have been numerous approaches to optimize the redistribution of the removed weights, there seems to be little or no study on the effect of activation functions on the performance of the Sparse Networks. This research provides insights into the relationship between the activation function used and the network performance at various sparsity levels.
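The iterative weight creation and removal that the abstract attributes to Sparse Evolutionary Training can be sketched as a prune-and-regrow step: drop a fraction of the smallest-magnitude active connections, then regrow the same number at random inactive positions so overall sparsity stays constant. The function below is an illustrative NumPy sketch under assumed names and hyperparameters (`zeta` as the rewiring fraction, a small Gaussian init for regrown weights), not the paper's actual implementation.

```python
import numpy as np

def set_prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    """One SET-style rewiring step (illustrative sketch, not the paper's code):
    remove the fraction `zeta` of smallest-magnitude active weights,
    then regrow the same number of connections at random positions."""
    rng = np.random.default_rng() if rng is None else rng
    active = np.flatnonzero(mask)
    n_rewire = int(zeta * active.size)
    if n_rewire == 0:
        return weights, mask
    flat_w = weights.ravel()  # view: edits below modify `weights` in place
    # Prune: deactivate the n_rewire active connections closest to zero.
    order = np.argsort(np.abs(flat_w[active]))
    to_remove = active[order[:n_rewire]]
    mask.ravel()[to_remove] = False
    flat_w[to_remove] = 0.0
    # Regrow: activate the same number of randomly chosen inactive positions,
    # so the total number of connections (the sparsity level) is unchanged.
    inactive = np.flatnonzero(~mask.ravel())
    regrow = rng.choice(inactive, size=n_rewire, replace=False)
    mask.ravel()[regrow] = True
    flat_w[regrow] = rng.normal(scale=0.01, size=n_rewire)
    return weights, mask
```

Because the connection count is preserved, the step can be applied after every training epoch without the layer densifying over time.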
