Title
NSM Converges to a k-NN Regressor Under Loose Lipschitz Estimates
Authors
Abstract
Although it is known that accurate Lipschitz estimates are essential for certain models to deliver good predictive performance, refining this constant in practice can be a difficult task, especially when the input dimension is high. In this work, we shed light on the consequences of employing loose Lipschitz bounds in the Nonlinear Set Membership (NSM) framework, showing that the model converges to a nearest neighbor regressor (k-NN with k=1). Moreover, this convergence is not uniform, and it is monotonic in the univariate case. An intuitive geometrical interpretation of the result is then given, and its practical implications are discussed.
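The convergence claimed in the abstract can be illustrated numerically. The sketch below uses the standard NSM central estimator, which averages the tightest Lipschitz-compatible upper and lower bounds built from the data; the toy univariate dataset and the function names are purely illustrative, not taken from the paper. As the assumed Lipschitz constant `L` grows, both bounds become dominated by the nearest sample, and the central estimate approaches the 1-NN prediction:

```python
import numpy as np

def nsm_central(x, X, y, L):
    """Standard NSM central estimate at query point x.

    upper(x) = min_i (y_i + L * |x - x_i|)
    lower(x) = max_i (y_i - L * |x - x_i|)
    central(x) = (upper(x) + lower(x)) / 2
    """
    d = np.abs(X - x)          # distances to the samples (univariate case)
    upper = np.min(y + L * d)  # tightest Lipschitz upper bound
    lower = np.max(y - L * d)  # tightest Lipschitz lower bound
    return 0.5 * (upper + lower)

def one_nn(x, X, y):
    """1-nearest-neighbor prediction at x."""
    return y[np.argmin(np.abs(X - x))]

# Toy dataset (hypothetical, for illustration only).
X = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 1.0, 0.5, 2.0])
x = 1.3

# With an overestimated L, the NSM central estimate tends to the
# 1-NN prediction; with a small L, the two generally differ.
for L in (1.0, 10.0, 1e6):
    print(f"L={L:g}: NSM={nsm_central(x, X, y, L):.4f}, 1-NN={one_nn(x, X, y):.4f}")
```

For large `L`, the same sample both minimizes `y_i + L*d_i` and maximizes `y_i - L*d_i`, so the `L`-dependent terms cancel in the average and only that neighbor's output value remains, matching the paper's limit result.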