Title

Distributed Inference with Sparse and Quantized Communication

Authors

Aritra Mitra, John A. Richards, Saurabh Bagchi, Shreyas Sundaram

Abstract

We consider the problem of distributed inference where agents in a network observe a stream of private signals generated by an unknown state, and aim to uniquely identify this state from a finite set of hypotheses. We focus on scenarios where communication between agents is costly, and takes place over channels with finite bandwidth. To reduce the frequency of communication, we develop a novel event-triggered distributed learning rule that is based on the principle of diffusing low beliefs on each false hypothesis. Building on this principle, we design a trigger condition under which an agent broadcasts only those components of its belief vector that have adequate innovation, to only those neighbors that require such information. We prove that our rule guarantees convergence to the true state exponentially fast almost surely despite sparse communication, and that it has the potential to significantly reduce information flow from uninformative agents to informative agents. Next, to deal with finite-precision communication channels, we propose a distributed learning rule that leverages the idea of adaptive quantization. We show that by sequentially refining the range of the quantizers, every agent can learn the truth exponentially fast almost surely, while using just $1$ bit to encode its belief on each hypothesis. For both our proposed algorithms, we rigorously characterize the trade-offs between communication-efficiency and the learning rate.
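The event-triggered idea can be pictured with a small simulation. The sketch below is not the paper's algorithm: the Gaussian observation model, the ring topology, the fixed innovation threshold `THRESHOLD`, and the min-based pooling of neighborhood log-beliefs are illustrative assumptions standing in for the "diffuse low beliefs on each false hypothesis" principle and the trigger condition described in the abstract.

```python
# Minimal sketch (assumed model, not the paper's exact rule): agents keep
# log-beliefs over a finite hypothesis set, pool the lowest value seen in the
# neighborhood for each hypothesis, fold in a private Bayesian update, and
# broadcast a component only when it has changed enough since its last send.
import numpy as np

rng = np.random.default_rng(0)

N, M = 4, 3                   # number of agents, number of hypotheses
TRUE_STATE = 0
THRESHOLD = 0.5               # assumed innovation threshold for the trigger
T = 200                       # time steps

# Ring network: agent i listens to agent (i + 1) mod N in addition to itself.
neighbors = {i: [(i + 1) % N] for i in range(N)}

# Illustrative observation model: agent i sees Gaussian signals whose mean
# depends on the hypothesis; rows with repeated means model agents that cannot
# distinguish certain hypotheses on their own (uninformative agents).
means = np.array([[0.0, 1.0, 2.0],
                  [0.0, 0.0, 1.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])

def log_likelihood(i, y):
    """Log-likelihood of observation y at agent i under each hypothesis."""
    return -0.5 * (y - means[i]) ** 2

log_beliefs = np.zeros((N, M))       # log of (unnormalized) beliefs
last_broadcast = np.zeros((N, M))    # component values last actually sent
broadcasts = 0

for t in range(T):
    # Event-triggered communication: broadcast component m only if it has
    # changed by more than THRESHOLD since the last time it was sent.
    sent = last_broadcast.copy()
    for i in range(N):
        innovate = np.abs(log_beliefs[i] - last_broadcast[i]) > THRESHOLD
        sent[i, innovate] = log_beliefs[i, innovate]
        broadcasts += int(innovate.sum())
    last_broadcast = sent

    # Min-based aggregation + local Bayesian step: keep the lowest log-belief
    # seen in the neighborhood for each hypothesis (low beliefs on false
    # hypotheses spread quickly), then fold in the new private observation.
    updated = np.empty_like(log_beliefs)
    for i in range(N):
        y = rng.normal(means[i, TRUE_STATE], 1.0)
        received = np.min(sent[neighbors[i]], axis=0)
        pooled = np.minimum(log_beliefs[i], received)
        updated[i] = pooled + log_likelihood(i, y)
        updated[i] -= updated[i].max()   # shift for numerical stability
    log_beliefs = updated

print("estimated states:", log_beliefs.argmax(axis=1))
print("broadcasts used:", broadcasts, "of", N * M * T, "possible")
```

For the finite-bandwidth setting, the 1-bit-per-hypothesis claim rests on adaptive quantization. The toy class below only illustrates the general "sequentially refine the quantizer's range" idea with a bisection-style encoder for a fixed scalar; the initial range, the halving schedule, and the name `OneBitAdaptiveQuantizer` are assumptions, and the paper's construction additionally has to track beliefs that change over time.

```python
# Sketch of a 1-bit adaptive ("zoom-in") quantizer: sender and receiver keep
# synchronized copies of a range [lo, hi]; each round the sender transmits one
# bit indicating which half of the range the value lies in, and both sides
# shrink the range to that half.
class OneBitAdaptiveQuantizer:
    def __init__(self, lo=-10.0, hi=10.0):
        self.lo, self.hi = lo, hi

    def encode(self, value):
        """Sender side: return one bit and refine the local range."""
        bit = int(value >= 0.5 * (self.lo + self.hi))
        self._refine(bit)
        return bit

    def decode(self, bit):
        """Receiver side: refine the mirrored range and return an estimate."""
        self._refine(bit)
        return 0.5 * (self.lo + self.hi)

    def _refine(self, bit):
        mid = 0.5 * (self.lo + self.hi)
        if bit:
            self.lo = mid
        else:
            self.hi = mid


# Example: a fixed value is recovered from one bit per round.
tx, rx = OneBitAdaptiveQuantizer(), OneBitAdaptiveQuantizer()
value = 3.7
for _ in range(30):
    estimate = rx.decode(tx.encode(value))
print(round(estimate, 4))   # approaches 3.7 as the range is refined
```

In both sketches the knobs that control communication (the trigger threshold, the refinement schedule) also govern how quickly agents can learn, which mirrors the communication-efficiency versus learning-rate trade-off characterized in the paper.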
