Title

Scale-covariant and scale-invariant Gaussian derivative networks

Authors

Lindeberg, Tony

Abstract

This paper presents a hybrid approach between scale-space theory and deep learning, where a deep learning architecture is constructed by coupling parameterized scale-space operations in cascade. By sharing the learnt parameters between multiple scale channels, and by using the transformation properties of the scale-space primitives under scaling transformations, the resulting network becomes provably scale covariant. By in addition performing max pooling over the multiple scale channels, a resulting network architecture for image classification also becomes provably scale invariant. We investigate the performance of such networks on the MNISTLargeScale dataset, which contains rescaled images from original MNIST over a factor of 4 concerning training data and over a factor of 16 concerning testing data. It is demonstrated that the resulting approach allows for scale generalization, enabling good performance for classifying patterns at scales not present in the training data.
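The core mechanism described above — applying the same learnt filter parameters in multiple scale channels and max-pooling over those channels — can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes first-order scale-normalized Gaussian derivatives (via `scipy.ndimage.gaussian_filter`) as the scale-space primitives, with the weight vector `weights` playing the role of the shared learnt parameters:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_derivative_layer(image, sigma, weights):
    """One parameterized scale-space operation: a learnt linear combination
    of scale-normalized Gaussian derivatives (derivatives multiplied by
    sigma to the order of differentiation) at scale sigma."""
    L   = gaussian_filter(image, sigma)                  # smoothed image
    Lx  = sigma * gaussian_filter(image, sigma, order=(0, 1))  # d/dx, scale-normalized
    Ly  = sigma * gaussian_filter(image, sigma, order=(1, 0))  # d/dy, scale-normalized
    return weights[0] * L + weights[1] * Lx + weights[2] * Ly

def scale_invariant_response(image, sigmas, weights):
    """Shared weights across all scale channels; max pooling over the
    channels makes the scalar output approximately scale invariant."""
    channel_maxima = [np.abs(gaussian_derivative_layer(image, s, weights)).max()
                      for s in sigmas]
    return max(channel_maxima)

if __name__ == "__main__":
    img = np.zeros((32, 32))
    img[16, 16] = 1.0                      # toy input: a single impulse
    w = np.array([1.0, 0.5, 0.5])          # stand-in for learnt parameters
    out = gaussian_derivative_layer(img, 2.0, w)
    print(out.shape)                       # per-channel response map
    print(scale_invariant_response(img, [1.0, 2.0, 4.0], w))
```

In the paper's architecture such layers are coupled in cascade with nonlinearities between them, and the scale covariance follows from the transformation properties of the Gaussian derivatives rather than from this toy pooling alone; the sketch only shows how parameter sharing across scale channels combines with max pooling over scales.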
