Paper Title
Analysis of Self-Supervised Learning and Dimensionality Reduction Methods in Clustering-Based Active Learning for Speech Emotion Recognition
Paper Authors
Paper Abstract
When domain experts are needed to perform data annotation for complex machine-learning tasks, reducing annotation effort is crucial in order to cut down time and expenses. For cases when there are no annotations available, one approach is to utilize the structure of the feature space for clustering-based active learning (AL) methods. However, these methods are heavily dependent on how the samples are organized in the feature space and on which distance metric is used. Unsupervised methods such as contrastive predictive coding (CPC) can potentially be used to learn organized feature spaces, but these methods typically produce high-dimensional features, which can make estimating data density challenging. In this paper, we combine CPC with multiple dimensionality reduction methods in search of functioning practices for clustering-based AL. Our experiments simulating speech emotion recognition system deployment show that both the local and global topology of the feature space can be successfully used for AL, and that CPC can be used to improve clustering-based AL performance over traditional signal features. Additionally, we observe that compressing data dimensionality does not substantially harm AL performance, and that 2-D feature representations achieve AL performance similar to that of higher-dimensional representations when the number of annotations is not very low.
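To make the pipeline described in the abstract concrete, the following is a minimal sketch, not the paper's actual implementation: random vectors stand in for CPC embeddings, PCA (one common dimensionality reduction choice) projects them to 2-D, and a simple k-means clustering selects the sample nearest each centroid as an annotation query. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for high-dimensional unsupervised (e.g. CPC) features
X = rng.normal(size=(300, 64))

# PCA via SVD: project the features to a 2-D representation
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X2 = Xc @ Vt[:2].T  # shape (300, 2)

def kmeans(data, k, iters=50, seed=0):
    """Plain Lloyd's k-means; k plays the role of the annotation budget."""
    gen = np.random.default_rng(seed)
    centers = data[gen.choice(len(data), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(data[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = data[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers, labels

centers, labels = kmeans(X2, k=10)

# Medoid-style AL query: annotate the sample closest to each cluster centroid
dists = np.linalg.norm(X2[:, None] - centers[None], axis=2)
queries = dists.argmin(axis=0)
print(sorted(set(queries.tolist())))
```

The clustering here operates on the compressed 2-D space, reflecting the abstract's observation that low-dimensional representations can support density-based AL selection; swapping PCA for another reduction method only changes how `X2` is computed.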