Paper Title

EvoPruneDeepTL: An Evolutionary Pruning Model for Transfer Learning based Deep Neural Networks

Authors

Poyatos, Javier; Molina, Daniel; Martinez, Aritz D.; Del Ser, Javier; Herrera, Francisco

Abstract

In recent years, Deep Learning models have shown great performance in complex optimization problems. They generally require large training datasets, which is a limitation in most practical cases. Transfer learning allows importing the first layers of a pre-trained architecture and connecting them to fully-connected layers to adapt them to a new problem. Consequently, the configuration of these layers becomes crucial for the performance of the model. Unfortunately, the optimization of these models is usually a computationally demanding task. One strategy to optimize Deep Learning models is the pruning scheme. Pruning methods are focused on reducing the complexity of the network, assuming an expected performance penalty of the model once pruned. However, pruning could potentially be used to improve performance, using an optimization algorithm to identify and eventually remove unnecessary connections among neurons. This work proposes EvoPruneDeepTL, an evolutionary pruning model for Transfer Learning based Deep Neural Networks which replaces the last fully-connected layers with sparse layers optimized by a genetic algorithm. Depending on its solution encoding strategy, our proposed model can either perform optimized pruning or feature selection over the densely connected part of the neural network. We carry out different experiments with several datasets to assess the benefits of our proposal. Results show the contribution of EvoPruneDeepTL and feature selection to the overall computational efficiency of the network as a result of the optimization process. In particular, the accuracy is improved, reducing at the same time the number of active neurons in the final layers.
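The core idea of the abstract, a genetic algorithm evolving a binary mask that prunes connections (or selects features) in the final dense layers, can be illustrated with a toy sketch. This is not the paper's pipeline: the `USEFUL` neuron set, the fitness proxy, and the sparsity penalty below are invented stand-ins for the validation accuracy of a real transfer-learning network, and all GA hyperparameters are illustrative.

```python
import random

random.seed(0)

N_NEURONS = 20
# Hypothetical assumption: only these neurons carry signal; in the paper,
# fitness would instead come from evaluating the pruned network on data.
USEFUL = set(range(0, 20, 4))

def fitness(mask):
    # Toy proxy for validation accuracy: reward keeping the useful neurons,
    # and lightly penalize every active neuron to encourage pruning.
    kept_useful = sum(mask[i] for i in USEFUL) / len(USEFUL)
    active = sum(mask) / len(mask)
    return kept_useful - 0.2 * active

def evolve(pop_size=30, generations=60, p_mut=0.05):
    # Each individual is a binary mask over the final layer's neurons,
    # matching the solution encoding described in the abstract.
    pop = [[random.randint(0, 1) for _ in range(N_NEURONS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_NEURONS)  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Under this toy fitness, the GA drives the mask toward keeping the informative neurons while switching the rest off, mirroring the abstract's observation that accuracy can be preserved or improved while the number of active neurons shrinks.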
