Paper Title

Linear Hashing with $\ell_\infty$ guarantees and two-sided Kakeya bounds

Authors

Manik Dhar, Zeev Dvir

Abstract

We show that a randomly chosen linear map over a finite field gives a good hash function in the $\ell_\infty$ sense. More concretely, consider a set $S \subset \mathbb{F}_q^n$ and a randomly chosen linear map $L : \mathbb{F}_q^n \to \mathbb{F}_q^t$ with $q^t$ taken to be sufficiently smaller than $|S|$. Let $U_S$ denote a random variable distributed uniformly on $S$. Our main theorem shows that, with high probability over the choice of $L$, the random variable $L(U_S)$ is close to uniform in the $\ell_\infty$ norm. In other words, {\em every} element in the range $\mathbb{F}_q^t$ has about the same number of elements in $S$ mapped to it. This complements the widely-used Leftover Hash Lemma (LHL), which proves the analogous statement under the statistical, or $\ell_1$, distance (for a richer class of functions), as well as prior work on the expected largest 'bucket size' in linear hash functions [ADMPT99]. By known bounds from the load balancing literature [RS98], our results are tight and show that linear functions hash as well as truly random functions up to a constant factor in the entropy loss. Our proof leverages a connection between linear hashing and the finite field Kakeya problem and extends some of the tools developed in this area, in particular the polynomial method.
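The setup in the abstract is easy to simulate. The sketch below (an illustration under assumed parameters, not from the paper) picks a random linear map $L : \mathbb{F}_2^n \to \mathbb{F}_2^t$ as a random $t \times n$ binary matrix, hashes a set $S$ with $|S| \gg 2^t$, and tallies the bucket sizes. The theorem predicts that, with high probability over $L$, every bucket size is close to $|S|/2^t$; the parameters `n`, `t`, and `|S|` here are arbitrary choices for demonstration.

```python
import itertools
import random

q, n, t = 2, 10, 3  # illustrative parameters (an assumption, not from the paper)
random.seed(0)

# A random linear map L: F_2^n -> F_2^t, represented as a t x n binary matrix.
L = [[random.randrange(q) for _ in range(n)] for _ in range(t)]

def apply_L(x):
    # Matrix-vector product over F_q (here q = 2).
    return tuple(sum(a * b for a, b in zip(row, x)) % q for row in L)

# A set S subset of F_2^n with |S| much larger than q^t = 8.
universe = list(itertools.product(range(q), repeat=n))
S = random.sample(universe, 512)

# Bucket sizes: how many elements of S map to each point of F_2^t.
buckets = {}
for x in S:
    y = apply_L(x)
    buckets[y] = buckets.get(y, 0) + 1

expected = len(S) / q**t  # uniform would put 64 elements in each bucket
print(sorted(buckets.values()), expected)
```

An $\ell_\infty$ guarantee says the largest deviation $\max_y \,|\,\#\{x \in S : L(x) = y\} - |S|/q^t\,|$ is small, which is strictly stronger than the $\ell_1$ (statistical-distance) guarantee of the Leftover Hash Lemma: the LHL only bounds the total deviation summed over all buckets.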
