Title
Feature Selection Using Batch-Wise Attenuation and Feature Mask Normalization
Authors
Abstract
Feature selection is generally used as one of the most important preprocessing techniques in machine learning, as it helps to reduce the dimensionality of data and assists researchers and practitioners in understanding data. By utilizing feature selection, better performance can therefore be expected, together with reduced computational cost, memory complexity and even the amount of data required. Although there exist approaches that leverage the power of deep neural networks to carry out feature selection, many of them suffer from sensitive hyperparameters. This paper proposes a feature mask module (FM-module) for feature selection based on a novel batch-wise attenuation and feature mask normalization. The proposed method is almost free of hyperparameters and can be easily integrated into common neural networks as an embedded feature selection method. Experiments on popular image, text and speech datasets have shown that our approach is easy to use and achieves superior performance in comparison with other state-of-the-art deep-learning-based feature selection methods.
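The abstract above does not specify the FM-module's internals, so the following is only a hedged, minimal sketch of what an embedded feature-selection mask layer of this general kind might look like: a learnable score vector is normalized into a mask (a softmax is used here as one plausible form of "feature mask normalization") and multiplied onto every input sample, so that high-mask features pass through and low-mask features are attenuated. All names (`FeatureMask`, `top_k_features`) and the choice of softmax are illustrative assumptions, not the paper's actual method; in particular, the paper's batch-wise attenuation is not reproduced here.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(z - z.max())
    return e / e.sum()

class FeatureMask:
    """Illustrative sketch of an embedded feature-mask layer (not the
    paper's actual FM-module): learnable per-feature scores are
    normalized into a mask that attenuates the input features."""

    def __init__(self, num_features, seed=0):
        rng = np.random.default_rng(seed)
        # In a real network these logits would be trained jointly with
        # the downstream model; here they are just randomly initialized.
        self.scores = rng.normal(size=num_features)

    def forward(self, x):
        mask = softmax(self.scores)   # normalized mask, entries sum to 1
        return x * mask               # broadcast over the batch dimension

    def top_k_features(self, k):
        # After training, the features with the largest mask values
        # would be the ones selected.
        return np.argsort(softmax(self.scores))[::-1][:k]

fm = FeatureMask(num_features=5)
x = np.ones((3, 5))        # a batch of 3 samples with 5 features each
y = fm.forward(x)
print(y.shape)             # (3, 5)
print(fm.top_k_features(2))
```

Because the mask is part of the forward pass, gradients from the task loss would flow into the mask scores during training, which is what makes this an embedded (rather than filter or wrapper) feature selection scheme.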