Paper Title
The GatedTabTransformer. An enhanced deep learning architecture for tabular modeling
Paper Authors
Paper Abstract
There is an increasing interest in the application of deep learning architectures to tabular data. One of the state-of-the-art solutions is TabTransformer, which incorporates an attention mechanism to better track relationships between categorical features and then makes use of a standard MLP to output its final logits. In this paper we propose multiple modifications to the original TabTransformer that perform better on binary classification tasks, with AUROC gains of more than 1% on three separate datasets. Inspired by gated MLP, linear projections are implemented in the MLP block and multiple activation functions are tested. We also evaluate the importance of specific hyperparameters during training.
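The abstract mentions gated-MLP-style linear projections in the MLP block. As a rough illustration of the gating idea (not the paper's exact architecture), the following is a minimal NumPy sketch of a gMLP-style spatial gating unit: the channels are split in half, one half is linearly projected across the token dimension, and the result gates the other half elementwise. All shapes, weights, and the function name are illustrative assumptions.

```python
import numpy as np

def spatial_gating_unit(x, W, b):
    # x: (n_tokens, d) feature matrix; split channels into two halves
    u, v = np.split(x, 2, axis=-1)
    # linear projection across the token dimension (n_tokens x n_tokens weight)
    v = W @ v + b
    # elementwise gating: one half modulates the other
    return u * v

rng = np.random.default_rng(0)
n_tokens, d = 4, 8
x = rng.normal(size=(n_tokens, d))
W = rng.normal(size=(n_tokens, n_tokens)) * 0.1  # small init, as in gMLP
b = np.ones((n_tokens, 1))                       # bias near identity gating
out = spatial_gating_unit(x, W, b)
print(out.shape)  # half the input channels remain after the split
```

In the gMLP paper this unit sits between two channel projections with an activation; the near-identity initialization (small `W`, bias of ones) keeps early training stable.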