Paper Title

Accelerating Score-based Generative Models with Preconditioned Diffusion Sampling

Paper Authors

Hengyuan Ma, Li Zhang, Xiatian Zhu, Jianfeng Feng

Paper Abstract

Score-based generative models (SGMs) have recently emerged as a promising class of generative models. However, a fundamental limitation is that their inference is very slow due to a need for many (e.g., 2000) iterations of sequential computations. An intuitive acceleration method is to reduce the sampling iterations, which however causes severe performance degradation. We investigate this problem by viewing the diffusion sampling process as a Metropolis adjusted Langevin algorithm, which helps reveal that the underlying cause is ill-conditioned curvature. Under this insight, we propose a model-agnostic preconditioned diffusion sampling (PDS) method that leverages matrix preconditioning to alleviate the aforementioned problem. Crucially, PDS is proven theoretically to converge to the original target distribution of an SGM, with no need for retraining. Extensive experiments on three image datasets with a variety of resolutions and diversity validate that PDS consistently accelerates off-the-shelf SGMs whilst maintaining the synthesis quality. In particular, PDS can accelerate by up to 29x on more challenging high resolution (1024x1024) image generation.
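As a rough illustration of why preconditioning helps with ill-conditioned curvature (this is a toy sketch, not the paper's PDS construction; the target distribution, step size, and diagonal preconditioner below are all invented for illustration), the following NumPy snippet compares plain and preconditioned Langevin sampling on a deliberately ill-conditioned 2-D Gaussian:

```python
import numpy as np

# Toy example: Langevin sampling of an ill-conditioned 2-D Gaussian
# N(0, diag(scales)), whose score is simply -x / scales.
rng = np.random.default_rng(0)
scales = np.array([100.0, 1.0])  # variances differ by 100x: ill-conditioned

def score(x):
    return -x / scales  # grad log p(x) for N(0, diag(scales))

def langevin(n_steps, eps, M):
    # Preconditioned Langevin update with diagonal preconditioner M:
    #   x <- x + (eps/2) * M * score(x) + sqrt(eps * M) * z,  z ~ N(0, I)
    x = np.zeros(2)
    samples = []
    for _ in range(n_steps):
        z = rng.standard_normal(2)
        x = x + 0.5 * eps * M * score(x) + np.sqrt(eps * M) * z
        samples.append(x.copy())
    return np.array(samples)

plain = langevin(2000, 0.5, np.ones(2))  # identity preconditioner
precond = langevin(2000, 0.5, scales)    # M matched to the target covariance

# Along the high-variance ("stiff") axis, the plain chain moves in tiny,
# highly correlated steps, while the preconditioned chain mixes quickly.
lag1 = lambda s: np.corrcoef(s[:-1], s[1:])[0, 1]
print("plain lag-1 autocorr:  ", lag1(plain[:, 0]))
print("precond lag-1 autocorr:", lag1(precond[:, 0]))
```

The preconditioner rescales each coordinate so that all directions of the target mix at roughly the same rate, which is the same intuition PDS applies to the diffusion sampling process.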
