Paper Title

Functorial Manifold Learning

Authors

Shiebler, Dan

Abstract

We adapt previous research on category theory and topological unsupervised learning to develop a functorial perspective on manifold learning, also known as nonlinear dimensionality reduction. We first characterize manifold learning algorithms as functors that map pseudometric spaces to optimization objectives and that factor through hierarchical clustering functors. We then use this characterization to prove refinement bounds on manifold learning loss functions and construct a hierarchy of manifold learning algorithms based on their equivariants. We express several popular manifold learning algorithms as functors at different levels of this hierarchy, including Metric Multidimensional Scaling, IsoMap, and UMAP. Next, we use interleaving distance to study the stability of a broad class of manifold learning algorithms. We present bounds on how closely the embeddings these algorithms produce from noisy data approximate the embeddings they would learn from noiseless data. Finally, we use our framework to derive a set of novel manifold learning algorithms, which we experimentally demonstrate are competitive with the state of the art.
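As a point of reference for the algorithms the abstract names, the sketch below runs two of them, Metric Multidimensional Scaling and IsoMap, on a synthetic dataset using scikit-learn. This is only an illustration of the standard algorithms, not the paper's functorial construction; the dataset and parameter choices (swiss roll, `n_neighbors=10`) are assumptions for the example.

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import MDS, Isomap

# Synthetic 3-D manifold data (assumption: swiss roll, 300 points).
X, _ = make_swiss_roll(n_samples=300, random_state=0)

# Metric Multidimensional Scaling: embeds points so that pairwise
# Euclidean distances are preserved as closely as possible.
mds_embedding = MDS(n_components=2, random_state=0).fit_transform(X)

# IsoMap: preserves geodesic distances estimated along a k-nearest-neighbor graph.
iso_embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

print(mds_embedding.shape, iso_embedding.shape)  # (300, 2) (300, 2)
```

Both calls return 2-D embeddings of the 300 input points; the paper's hierarchy places such algorithms at different levels according to their equivariants.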
