Paper Title


Building a Competitive Associative Classifier

Authors

Nitakshi Sood, Osmar Zaiane

Abstract

With the huge success of deep learning, other machine learning paradigms have had to take a back seat. Yet other models, particularly rule-based ones, are more readable and explainable and can even be competitive when labelled data is not abundant. However, most existing rule-based classifiers suffer from the production of a large number of classification rules, affecting model readability. This hampers classification accuracy, as noisy rules might not add any useful information for classification and also lead to longer classification times. In this study, we propose SigD2, which uses a novel two-stage pruning strategy that prunes most of the noisy, redundant and uninteresting rules and makes the classification model more accurate and readable. To make SigDirect more competitive with the most prevalent but uninterpretable machine-learning-based classifiers, such as neural networks and support vector machines, we propose bagging and boosting ensembles of the SigDirect classifier. The results of the proposed algorithms are quite promising, and we are able to obtain a minimal set of statistically significant rules for classification without jeopardizing classification accuracy. We use 15 UCI datasets and compare our approach with eight existing systems. The SigD2 and boosted SigDirect (ACboost) ensemble models outperform various state-of-the-art classifiers not only in terms of classification accuracy but also in terms of the number of rules.
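
To make the bagging/boosting idea in the abstract concrete, here is a minimal, hypothetical sketch (not the authors' implementation) of wrapping a base classifier in bagging and boosting ensembles. Since SigDirect is not a standard library model, a shallow decision tree stands in as a placeholder base estimator, and the dataset and parameters are illustrative assumptions only.

# Minimal sketch: bagging and boosting over a placeholder base classifier.
# SigDirect itself is not available in scikit-learn, so DecisionTreeClassifier
# is used purely as a stand-in; all parameters below are illustrative.
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)           # stand-in for one of the UCI datasets
base = DecisionTreeClassifier(max_depth=3)  # placeholder for a SigDirect-style rule learner

# Bagging: train base classifiers on bootstrap samples and combine by voting.
bagged = BaggingClassifier(estimator=base, n_estimators=10, random_state=0)

# Boosting (cf. ACboost in the abstract): reweight training instances each round.
boosted = AdaBoostClassifier(estimator=base, n_estimators=10, random_state=0)

for name, model in [("bagging", bagged), ("boosting", boosted)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")

The point of the sketch is only the ensemble wiring: bagging reduces variance by resampling the training data, while boosting focuses successive base classifiers on previously misclassified instances, which is the role ACboost plays for SigDirect in the paper.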
