Paper Title

Neural Networks with A La Carte Selection of Activation Functions

Paper Authors

Sipper, Moshe

Paper Abstract

Activation functions (AFs), which are pivotal to the success (or failure) of a neural network, have received increased attention in recent years, with researchers seeking to design novel AFs that improve some aspect of network performance. In this paper we take another direction, wherein we combine a slew of known AFs into successful architectures, proposing three methods to do so beneficially: 1) generate AF architectures at random, 2) use Optuna, an automatic hyper-parameter optimization software framework, with a Tree-structured Parzen Estimator (TPE) sampler, and 3) use Optuna with a Covariance Matrix Adaptation Evolution Strategy (CMA-ES) sampler. We show that all methods often produce significantly better results for 25 classification problems when compared with a standard network composed of ReLU hidden units and a softmax output unit. Optuna with the TPE sampler emerged as the best AF architecture-producing method.
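To illustrate the second method described in the abstract, the following is a minimal sketch of how Optuna's TPE sampler can search over per-layer activation-function choices. This is not the paper's code: the AF pool, network architecture, dataset (scikit-learn's digits), and training budget are illustrative assumptions made here for a self-contained example.

```python
# Hypothetical sketch: use Optuna's TPE sampler to choose one activation
# function per hidden layer of a small PyTorch classifier.
import optuna
import torch
import torch.nn as nn
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

# Illustrative pool of known AFs (the paper considers a larger slew).
AF_POOL = {"relu": nn.ReLU, "tanh": nn.Tanh, "sigmoid": nn.Sigmoid, "elu": nn.ELU}

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
X_tr, X_te = torch.tensor(X_tr, dtype=torch.float32), torch.tensor(X_te, dtype=torch.float32)
y_tr, y_te = torch.tensor(y_tr), torch.tensor(y_te)

def objective(trial):
    # TPE suggests an AF name for each of the two hidden layers.
    afs = [AF_POOL[trial.suggest_categorical(f"af_{i}", list(AF_POOL))]()
           for i in range(2)]
    model = nn.Sequential(
        nn.Linear(64, 32), afs[0],
        nn.Linear(32, 32), afs[1],
        nn.Linear(32, 10),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(50):  # deliberately short training budget for illustration
        opt.zero_grad()
        loss = loss_fn(model(X_tr), y_tr)
        loss.backward()
        opt.step()
    with torch.no_grad():
        acc = (model(X_te).argmax(dim=1) == y_te).float().mean().item()
    return acc  # Optuna maximizes held-out accuracy over the AF choices

study = optuna.create_study(direction="maximize",
                            sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=20)
print(study.best_params, study.best_value)
```

Swapping the sampler for `optuna.samplers.CmaEsSampler()` would correspond to the abstract's third method, and drawing the AF names uniformly at random (without Optuna) to the first.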
