Title
Stable Parametrization of Continuous and Piecewise-Linear Functions
Authors
Abstract
Rectified-linear-unit (ReLU) neural networks, which play a prominent role in deep learning, generate continuous and piecewise-linear (CPWL) functions. While they provide a powerful parametric representation, the mapping between the parameter and function spaces lacks stability. In this paper, we investigate an alternative representation of CPWL functions that relies on local hat basis functions. It is predicated on the fact that any CPWL function can be specified by a triangulation and its values at the grid points. We give the necessary and sufficient condition on the triangulation (in any number of dimensions) for the hat functions to form a Riesz basis, which ensures that the link between the parameters and the corresponding CPWL function is stable and unique. In addition, we provide an estimate of the $\ell_2\rightarrow L_2$ condition number of this local representation. Finally, as a special case of our framework, we focus on a systematic parametrization of $\mathbb{R}^d$ with control points placed on a uniform grid. In particular, we choose hat basis functions that are shifted replicas of a single linear box spline. In this setting, we prove that our general estimate of the condition number is optimal. We also relate our local representation to a nonlocal one based on shifts of a causal ReLU-like function.
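The abstract's central fact, that a CPWL function is fully determined by a triangulation and its values at the grid points, can be illustrated with a minimal one-dimensional sketch. Here the hat basis functions are shifted replicas of a single triangular B-spline on a uniform grid (the knot values below are hypothetical, chosen only for illustration):

```python
import numpy as np

def hat(x, k, h=1.0):
    """Triangular (hat) basis function centered at knot k*h, supported on [(k-1)h, (k+1)h]."""
    return np.maximum(0.0, 1.0 - np.abs(x / h - k))

# Uniform grid on [0, 5] with spacing h = 1 and arbitrary knot values.
h = 1.0
knots = np.arange(6)
coeffs = np.array([0.0, 2.0, 1.0, 3.0, -1.0, 0.5])

# Local representation: f(x) = sum_k f(k*h) * hat(x, k, h).
x = np.linspace(0.0, 5.0, 101)
f = sum(c * hat(x, k, h) for k, c in zip(knots, coeffs))

# The expansion interpolates the coefficients at the knots,
# since hat(k*h, k) = 1 and hat(j*h, k) = 0 for j != k.
assert np.allclose(f[::20], coeffs)

# Between knots, the expansion is linear interpolation of adjacent values.
assert np.isclose(f[10], 0.5 * (coeffs[0] + coeffs[1]))  # midpoint of the first cell
```

Because each coefficient is simply the function value at a grid point, perturbing one coefficient changes the function only locally, which is the stability property that the nonlocal ReLU parametrization lacks.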