Paper Title


Dissipative residual layers for unsupervised implicit parameterization of data manifolds

Paper Authors

Reshniak, Viktor

Paper Abstract


We propose an unsupervised technique for implicit parameterization of data manifolds. In our approach, the data is assumed to belong to a lower-dimensional manifold in a higher-dimensional space, and the data points are viewed as the endpoints of trajectories originating outside the manifold. Under this assumption, the data manifold is an attractive manifold of a dynamical system to be estimated. We parameterize such a dynamical system with a residual neural network and propose a spectral localization technique to ensure that it is locally attractive in the vicinity of the data. We also present initialization and additional regularization for the proposed residual layers, which we call dissipative bottlenecks. We discuss the importance of the considered problem for reinforcement learning tasks and support our discussion with examples demonstrating the performance of the proposed layers in denoising and generative tasks.
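
Illustrative Sketch

The abstract describes the approach only at a high level; the paper's actual spectral localization technique and dissipative bottleneck layers are not reproduced here. As a minimal sketch of the underlying idea, one generic way to make a residual update contractive, so that iterating it pulls points toward an attracting set, is to bound the Lipschitz constant of the residual branch with spectral normalization. The class name and hyperparameters (`gamma`, `lip`, `hidden`) below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch (not the paper's implementation): a residual update
#   Phi(x) = x + gamma * (lip * g(x) - x) = (1 - gamma) * x + gamma * lip * g(x)
# whose Lipschitz constant is <= (1 - gamma) + gamma * lip < 1, i.e. a strict
# contraction, so repeated application drives inputs toward its attractor.
import torch
import torch.nn as nn


class ContractiveResidualBlock(nn.Module):
    def __init__(self, dim: int, hidden: int = 64, gamma: float = 0.5, lip: float = 0.9):
        super().__init__()
        assert 0.0 < gamma <= 1.0 and 0.0 < lip < 1.0
        self.gamma = gamma
        self.lip = lip
        # spectral_norm (approximately) bounds each weight's operator norm by 1;
        # with a 1-Lipschitz activation, g has Lipschitz constant <= 1.
        self.g = nn.Sequential(
            nn.utils.spectral_norm(nn.Linear(dim, hidden)),
            nn.Tanh(),
            nn.utils.spectral_norm(nn.Linear(hidden, dim)),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Convex-type combination of the identity and the scaled residual branch.
        return x + self.gamma * (self.lip * self.g(x) - x)


if __name__ == "__main__":
    torch.manual_seed(0)
    block = ContractiveResidualBlock(dim=2)
    x = torch.randn(256, 2)            # points scattered off the (unknown) manifold
    with torch.no_grad():
        for _ in range(30):            # iterate the discrete dynamical system
            x = block(x)               # the contraction pulls trajectories toward its attractor
```

In an untrained block the attractor is arbitrary; a training objective (e.g., denoising toward clean samples) would be needed to align the attracting set with the data manifold, which is the role the paper assigns to its unsupervised parameterization.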
