Paper Title

Submodular Combinatorial Information Measures with Applications in Machine Learning

Paper Authors

Rishabh Iyer, Ninad Khargonkar, Jeff Bilmes, Himanshu Asnani

Paper Abstract


Information-theoretic quantities like entropy and mutual information have found numerous uses in machine learning. It is well known that there is a strong connection between these entropic quantities and submodularity since entropy over a set of random variables is submodular. In this paper, we study combinatorial information measures that generalize independence, (conditional) entropy, (conditional) mutual information, and total correlation defined over sets of (not necessarily random) variables. These measures strictly generalize the corresponding entropic measures since they are all parameterized via submodular functions that themselves strictly generalize entropy. Critically, we show that, unlike entropic mutual information in general, the submodular mutual information is actually submodular in one argument, holding the other fixed, for a large class of submodular functions whose third-order partial derivatives satisfy a non-negativity property. This turns out to include a number of practically useful cases such as the facility location and set-cover functions. We study specific instantiations of the submodular information measures on these, as well as the probabilistic coverage, graph-cut, and saturated coverage functions, and see that they all have mathematically intuitive and practically useful expressions. Regarding applications, we connect the maximization of submodular (conditional) mutual information to problems such as mutual-information-based, query-based, and privacy-preserving summarization -- and we connect optimizing the multi-set submodular mutual information to clustering and robust partitioning.
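To make the central definition concrete, the following minimal sketch (not taken from the paper's code; the toy similarity matrix and function names are illustrative assumptions) computes the submodular mutual information I_f(A; B) = f(A) + f(B) - f(A ∪ B) when f is the facility location function, one of the practically useful instantiations the abstract mentions:

```python
def facility_location(A, sim):
    """Facility location: f(A) = sum over all items of their best
    similarity to the set A. This function is submodular."""
    if not A:
        return 0.0
    return sum(max(row[j] for j in A) for row in sim)

def submodular_mutual_information(A, B, sim):
    """Submodular mutual information: I_f(A; B) = f(A) + f(B) - f(A ∪ B)."""
    f = lambda S: facility_location(S, sim)
    return f(A) + f(B) - f(A | B)

# Toy symmetric similarity matrix over 4 items (illustrative values only):
# items 0 and 1 are similar to each other, as are items 2 and 3.
sim = [
    [1.0, 0.8, 0.1, 0.0],
    [0.8, 1.0, 0.2, 0.1],
    [0.1, 0.2, 1.0, 0.9],
    [0.0, 0.1, 0.9, 1.0],
]

# Sets covering similar items share information; dissimilar sets do not.
print(submodular_mutual_information({0}, {1}, sim))  # 1.7 (similar pair)
print(submodular_mutual_information({0}, {3}, sim))  # ~0.2 (dissimilar pair)
```

The same I_f definition applies to any submodular f (e.g. set cover or graph cut); swapping in a different f changes only the `facility_location` helper.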
