Paper Title
Maximum Likelihood on the Joint (Data, Condition) Distribution for Solving Ill-Posed Problems with Conditional Flow Models
Paper Authors
Paper Abstract
I describe a trick for training flow models using a prescribed rule as a surrogate for maximum likelihood. The utility of this trick is limited for non-conditional models, but an extension of the approach, applied to maximum likelihood of the joint probability distribution of data and conditioning information, can be used to train sophisticated \textit{conditional} flow models. Unlike previous approaches, this method is quite simple: it does not require explicit knowledge of the distribution of conditions, auxiliary networks or other specific architecture, or additional loss terms beyond maximum likelihood, and it preserves the correspondence between latent and data spaces. The resulting models have all the properties of non-conditional flow models, are robust to unexpected inputs, and can predict the distribution of solutions conditioned on a given input. They come with guarantees of prediction representativeness and are a natural and powerful way to solve highly uncertain problems. I demonstrate these properties on easily visualized toy problems, then use the method to successfully generate class-conditional images and to reconstruct highly degraded images via super-resolution.
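To make the stated objective concrete, below is a minimal sketch, not the paper's code, of training a flow by maximum likelihood on the joint (data, condition) vector. It assumes a toy RealNVP-style coupling flow in PyTorch; all identifiers (JointFlow, AffineCoupling, the toy data) are hypothetical illustration choices, and how conditional samples are drawn from the trained joint model is left to the paper itself.

# Minimal sketch: concatenate the condition c with the data x and train an
# ordinary normalizing flow by maximum likelihood on the joint vector [x, c].
# Hypothetical example code; not the author's implementation.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    # Transforms the second part of the vector conditioned on the first part.
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, v):
        a, b = v[..., :self.half], v[..., self.half:]
        scale, shift = self.net(a).chunk(2, dim=-1)
        scale = torch.tanh(scale)                  # bounded scales for stability
        b = b * torch.exp(scale) + shift
        return torch.cat([a, b], dim=-1), scale.sum(dim=-1)

class JointFlow(nn.Module):
    # Normalizing flow over the concatenated (data, condition) vector.
    def __init__(self, data_dim, cond_dim, n_layers=6):
        super().__init__()
        self.layers = nn.ModuleList(
            [AffineCoupling(data_dim + cond_dim) for _ in range(n_layers)]
        )

    def log_prob(self, x, c):
        v = torch.cat([x, c], dim=-1)              # joint vector [x, c]
        logdet = torch.zeros(v.shape[0])
        for layer in self.layers:
            v, ld = layer(v)
            logdet = logdet + ld
            v = torch.flip(v, dims=[-1])           # cheap permutation between layers
        base = torch.distributions.Normal(0.0, 1.0)
        return base.log_prob(v).sum(dim=-1) + logdet   # log p(x, c)

# Training maximizes log p(x, c) on a toy ill-posed problem: many x per condition c.
model = JointFlow(data_dim=2, cond_dim=1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(1000):
    c = torch.rand(256, 1)
    x = torch.randn(256, 2) * 0.1 + c              # noisy data whose mean depends on c
    loss = -model.log_prob(x, c).mean()            # negative joint log-likelihood
    opt.zero_grad()
    loss.backward()
    opt.step()

Note that the only loss term is the joint negative log-likelihood, consistent with the abstract's claim that no auxiliary networks or extra loss terms are required.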