Keywords
Computer science, Probabilistic logic, Scalability, Diffusion, Importance sampling, Code (set theory), Sample (material), Sampling (signal processing), Algorithm, Quality (philosophy), Denoising, Machine learning, Artificial intelligence, Data mining, Monte Carlo method, Statistics, Mathematics, Epistemology, Physics, Filter (signal processing), Philosophy, Thermodynamics, Database, Set (abstract data type), Chemistry, Chromatography, Programming language, Computer vision
Authors
Alex Nichol,Prafulla Dhariwal
Source
Journal: Cornell University - arXiv
Date: 2021-02-18
Citations: 28
Abstract
Denoising diffusion probabilistic models (DDPM) are a class of generative models which have recently been shown to produce excellent samples. We show that with a few simple modifications, DDPMs can also achieve competitive log-likelihoods while maintaining high sample quality. Additionally, we find that learning variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes with a negligible difference in sample quality, which is important for the practical deployment of these models. We additionally use precision and recall to compare how well DDPMs and GANs cover the target distribution. Finally, we show that the sample quality and likelihood of these models scale smoothly with model capacity and training compute, making them easily scalable. We release our code at this https URL
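The abstract's claims rest on the standard DDPM forward (noising) process, which admits a closed-form sample at any timestep, and on the cosine noise schedule this paper proposes as one of its "simple modifications". A minimal numpy sketch of both, assuming the schedule form alpha_bar(t) = f(t)/f(0) with f(t) = cos^2(((t/T + s)/(1 + s)) * pi/2) and offset s = 0.008 as described in the paper (function names here are illustrative, not from the released code):

```python
import numpy as np

def cosine_alpha_bar(T, s=0.008):
    """Cumulative product alpha_bar(t) under the cosine schedule
    (Nichol & Dhariwal, 2021). alpha_bar[0] = 1 means no noise;
    alpha_bar[T] ~ 0 means nearly pure Gaussian noise."""
    t = np.arange(T + 1)
    f = np.cos(((t / T) + s) / (1 + s) * np.pi / 2) ** 2
    return f / f[0]

def q_sample(x0, t, alpha_bar, rng):
    """Closed-form forward diffusion:
    x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps,
    with eps ~ N(0, I). No iteration over intermediate steps is needed."""
    ab = alpha_bar[t]
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(ab) * x0 + np.sqrt(1.0 - ab) * eps

rng = np.random.default_rng(0)
ab = cosine_alpha_bar(1000)          # schedule over T = 1000 steps
x0 = rng.standard_normal((4, 8))     # toy "data" batch
xt = q_sample(x0, 500, ab, rng)      # noised sample at the halfway step
```

The closed form is what makes training efficient: a random timestep can be noised directly, without simulating the chain, and the learned reverse-process variances the abstract mentions are what let sampling use far fewer steps than T.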