Paper Title


Regret Minimization with Noisy Observations

Authors

Mohammad Mahdian, Jieming Mao, Kangning Wang

Abstract


In a typical optimization problem, the task is to pick one of a number of options with the lowest cost or the highest value. In practice, these cost/value quantities often come through processes such as measurement or machine learning, which are noisy, with quantifiable noise distributions. To take these noise distributions into account, one approach is to assume a prior for the values, use it to build a posterior, and then apply standard stochastic optimization to pick a solution. However, in many practical applications, such prior distributions may not be available. In this paper, we study such scenarios using a regret minimization model. In our model, the task is to pick the highest one out of $n$ values. The values are unknown and chosen by an adversary, but can be observed through noisy channels, where additive noises are stochastically drawn from known distributions. The goal is to minimize the regret of our selection, defined as the expected difference between the highest and the selected value on the worst-case choices of values. We show that the naïve algorithm of picking the highest observed value has regret arbitrarily worse than the optimum, even when $n = 2$ and the noises are unbiased in expectation. On the other hand, we propose an algorithm which gives a constant-factor approximation to the optimal regret for any $n$. Our algorithm is conceptually simple, computationally efficient, and requires only minimal knowledge of the noise distributions.
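To make the model concrete, the following is a minimal Monte Carlo sketch (not from the paper) of the naïve rule the abstract critiques: pick the option with the highest noisy observation, and estimate its regret. The function name `naive_regret`, the specific values, and the Gaussian noise choices are illustrative assumptions; the abstract only requires that the noises be zero-mean.

```python
import random

def naive_regret(values, noise_samplers, trials=100_000, seed=0):
    """Monte Carlo estimate of the regret of the naive rule that picks
    the option with the highest noisy observation.
    Regret = E[max(values) - value of the selected option]."""
    rng = random.Random(seed)
    best = max(values)
    total = 0.0
    for _ in range(trials):
        # Each observation is the true value plus an independent noise draw.
        obs = [v + sampler(rng) for v, sampler in zip(values, noise_samplers)]
        chosen = max(range(len(values)), key=lambda i: obs[i])
        total += best - values[chosen]
    return total / trials

# n = 2: option 0 has the higher true value, but its noise channel is
# much wider; both noises are zero-mean, i.e., unbiased in expectation.
values = [1.0, 0.0]
noise = [lambda r: r.gauss(0.0, 10.0),  # wide zero-mean noise
         lambda r: r.gauss(0.0, 0.1)]   # narrow zero-mean noise
print(naive_regret(values, noise))  # wide noise frequently masks the gap
```

In this setup the wide noise on option 0 causes the naïve rule to pick the worse option nearly half the time, so its regret stays bounded away from zero, consistent with the abstract's claim that the naïve algorithm can be far from optimal even for $n = 2$ with unbiased noise.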
