Paper Title

APTx: better activation function than MISH, SWISH, and ReLU's variants used in deep learning

Paper Authors

Kumar, Ravin

Paper Abstract

Activation functions introduce non-linearity in deep neural networks. This non-linearity helps neural networks learn faster and more efficiently from the dataset. In deep learning, many activation functions are developed and used based on the type of problem statement. ReLU's variants, SWISH, and MISH are go-to activation functions. The MISH function is considered to have similar or even better performance than SWISH, and much better performance than ReLU. In this paper, we propose an activation function named APTx that behaves similarly to MISH but requires fewer mathematical operations to compute. The lower computational requirement of APTx speeds up model training and thus also reduces the hardware requirements of deep learning models. Source code: https://github.com/mr-ravin/aptx_activation
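A minimal PyTorch-style sketch of such an activation is shown below, assuming an APTx-style parametric form (alpha + tanh(beta * x)) * gamma * x; the exact formulation and parameter defaults should be taken from the linked repository, so the values alpha = 1, beta = 1, gamma = 0.5 used here are illustrative assumptions for a MISH-like curve, not a definitive implementation.

```python
import torch
import torch.nn as nn


class APTx(nn.Module):
    """Sketch of an APTx-style activation: (alpha + tanh(beta * x)) * gamma * x.

    The parameter defaults below are assumptions for illustration; refer to the
    repository linked in the abstract for the reference implementation.
    """

    def __init__(self, alpha: float = 1.0, beta: float = 1.0, gamma: float = 0.5):
        super().__init__()
        self.alpha = alpha
        self.beta = beta
        self.gamma = gamma

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # One tanh plus elementwise multiplies/adds -- cheaper than MISH's
        # composition x * tanh(softplus(x)) = x * tanh(ln(1 + exp(x))).
        return (self.alpha + torch.tanh(self.beta * x)) * self.gamma * x


if __name__ == "__main__":
    x = torch.linspace(-3.0, 3.0, steps=7)
    print(APTx()(x))
```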
