Paper Title
The Complexity of Dynamic Least-Squares Regression
Paper Authors
Paper Abstract
We settle the complexity of dynamic least-squares regression (LSR), where rows and labels $(\mathbf{A}^{(t)}, \mathbf{b}^{(t)})$ can be adaptively inserted and/or deleted, and the goal is to efficiently maintain an $\epsilon$-approximate solution to $\min_{\mathbf{x}^{(t)}} \| \mathbf{A}^{(t)} \mathbf{x}^{(t)} - \mathbf{b}^{(t)} \|_2$ for all $t \in [T]$. We prove sharp separations ($d^{2-o(1)}$ vs. $\sim d$) between the amortized update time of: (i) fully vs. partially dynamic $0.01$-LSR; (ii) high- vs. low-accuracy LSR in the partially dynamic (insertion-only) setting. Our lower bounds follow from a gap-amplification reduction -- reminiscent of iterative refinement -- from the exact version of the Online Matrix-Vector conjecture (OMv) [HKNS15] to constant-approximate OMv over the reals, where the $i$-th online product $\mathbf{H}\mathbf{v}^{(i)}$ only needs to be computed to $0.1$-relative error. All previous fine-grained reductions from OMv to its approximate versions only show hardness for inverse polynomial approximation $\epsilon = n^{-\omega(1)}$ (additive or multiplicative). This result is of independent interest in fine-grained complexity and for the investigation of the OMv Conjecture, which is still widely open.
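To give intuition for the gap-amplification idea mentioned above, here is a minimal numerical sketch (an illustration, not the paper's actual construction): a least-squares solver that is only $0.1$-accurate per call can be boosted to high accuracy by iterative refinement, i.e., repeatedly re-solving against the current residual. The names `noisy_solve` and `refine` are hypothetical; `noisy_solve` stands in for a fast constant-accuracy solver by perturbing an exact solve with 10% relative error.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_solve(A, r, rel_err=0.1):
    """Stand-in for a fast 0.1-approximate solver: an exact
    least-squares solve perturbed by rel_err relative error
    (measured in the A-norm)."""
    y = np.linalg.lstsq(A, r, rcond=None)[0]
    noise = rng.standard_normal(y.shape)
    scale = rel_err * np.linalg.norm(A @ y) / max(np.linalg.norm(A @ noise), 1e-30)
    return y + scale * noise

def refine(A, b, iters=20):
    """Iterative refinement: each pass shrinks the component of the
    error lying in range(A) by a constant factor (0.1 here), so
    O(log(1/eps)) calls to the crude solver give eps accuracy."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        r = b - A @ x               # current residual
        x = x + noisy_solve(A, r)   # approximate correction step
    return x

n, d = 200, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

x_opt = np.linalg.lstsq(A, b, rcond=None)[0]
x_ref = refine(A, b)
# Ratio of achieved cost to optimal cost; converges to ~1.0.
print(np.linalg.norm(A @ x_ref - b) / np.linalg.norm(A @ x_opt - b))
```

This is exactly why constant-approximate dynamic LSR cannot be much easier than high-accuracy dynamic LSR: a fast $0.1$-accurate data structure could be bootstrapped, contradicting hardness for the exact regime.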