Title

Linear Branching Programs and Directional Affine Extractors

Authors

Svyatoslav Gryaznov, Pavel Pudlák, Navid Talebanfard

Abstract

A natural model of read-once linear branching programs is a branching program where queries are $\mathbb{F}_2$ linear forms and, along each path, the queries are linearly independent (a small sketch of this independence condition follows the abstract). We consider two restrictions of this model, which we call weakly and strongly read-once; both generalize standard read-once branching programs and parity decision trees. Our main results are as follows.

- Average-case complexity. We define a pseudo-random class of functions, which we call directional affine extractors (a definition sketch also follows below), and show that these functions are hard on average for the strongly read-once model. We then present an explicit construction of such a function with good parameters. This strengthens the result of Cohen and Shinkar (ITCS'16), who gave such average-case hardness for parity decision trees. Directional affine extractors are stronger than the more familiar class of affine extractors. Given the significance of these functions, we expect that our new class of functions might be of independent interest.

- Proof complexity. We also consider the proof system $\text{Res}[\oplus]$, an extension of resolution with linear queries. A refutation of a CNF in this proof system naturally defines a linear branching program solving the corresponding search problem. Conversely, we show that a weakly read-once linear BP solving the search problem can be converted to a $\text{Res}[\oplus]$ refutation with constant blow-up.
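The read-once condition is purely linear-algebraic. As a hedged illustration (not code from the paper), the sketch below checks whether the query forms along a single path are linearly independent over $\mathbb{F}_2$; the bitmask encoding and the function name are assumptions made for this example.

```python
# Hedged sketch (not from the paper): each F_2 linear form over n variables
# is encoded as an int bitmask, where bit i set means variable x_i occurs.
# A path of a linear branching program satisfies the read-once condition
# when its query forms are linearly independent over F_2.

def is_linearly_independent(forms: list[int]) -> bool:
    """Gaussian elimination over F_2: reduce each form against the pivots
    collected so far; a form that reduces to 0 is a linear combination
    of earlier queries."""
    pivots = {}  # leading-bit position -> basis form with that leading bit
    for f in forms:
        while f:
            lead = f.bit_length() - 1
            if lead not in pivots:
                pivots[lead] = f
                break
            f ^= pivots[lead]  # XOR is addition over F_2
        if f == 0:
            return False
    return True

# x0+x1, x1+x2, x0+x2 sum to 0, so such a path would not be read-once;
# x0+x1, x1+x2, x2 are independent.
assert not is_linearly_independent([0b011, 0b110, 0b101])
assert is_linearly_independent([0b011, 0b110, 0b100])
```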
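For orientation, here is a hedged LaTeX sketch contrasting the two notions from the first bullet, paraphrased from the abstract's description; the exact bias and parameter conventions are the paper's, and $d$ and $\varepsilon$ are placeholder parameters here. The abstract notes the directional notion is the stronger one.

```latex
% Hedged paraphrase; the paper's exact parameter conventions may differ.
% Affine extractor for dimension d with bias eps:
\[
  \forall\, \text{affine } U \subseteq \mathbb{F}_2^n,\ \dim U \ge d:
  \quad
  \Bigl|\Pr_{x \sim U}\bigl[f(x) = 1\bigr] - \tfrac{1}{2}\Bigr| \le \varepsilon.
\]
% Directional affine extractor: every directional derivative
% f(x) + f(x+a) must extract as well:
\[
  \forall\, a \in \mathbb{F}_2^n \setminus \{0\},\;
  \forall\, \text{affine } U \subseteq \mathbb{F}_2^n,\ \dim U \ge d:
  \quad
  \Bigl|\Pr_{x \sim U}\bigl[f(x) + f(x + a) = 1\bigr] - \tfrac{1}{2}\Bigr| \le \varepsilon.
\]
```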
