Paper Title
Evolution of Activation Functions for Deep Learning-Based Image Classification
Paper Authors
Abstract
Activation functions (AFs) play a pivotal role in the performance of neural networks. The Rectified Linear Unit (ReLU) is currently the most commonly used AF. Several replacements for ReLU have been suggested, but improvements have proven inconsistent. Some AFs exhibit better performance for specific tasks, but it is hard to know a priori how to select the appropriate one(s). Studying both standard fully connected neural networks (FCNs) and convolutional neural networks (CNNs), we propose a novel, three-population, coevolutionary algorithm to evolve AFs, and compare it to four other methods, both evolutionary and non-evolutionary. Tested on four datasets -- MNIST, FashionMNIST, KMNIST, and USPS -- coevolution proves to be a performant algorithm for finding good AFs and AF architectures.
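To make the notion of alternative AFs concrete, the following is a minimal NumPy sketch of ReLU alongside two commonly proposed replacements from the broader literature (Leaky ReLU and Swish). These are illustrative examples only; they are not the functions evolved by the paper's coevolutionary algorithm.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x), the baseline AF discussed in the abstract.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: passes a small slope alpha for negative inputs
    # instead of zeroing them out.
    return np.where(x > 0, x, alpha * x)

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x), a smooth ReLU alternative.
    return x / (1.0 + np.exp(-beta * x))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))
print(leaky_relu(x))
print(swish(x))
```

Such drop-in replacements share ReLU's elementwise form, which is why choosing among them a priori is difficult and motivates searching over AFs automatically.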