Diffusion Models With Learned Adaptive Noise

Cornell Tech, NY.
NeurIPS 2024 (spotlight)

(Left) Comparison of noise schedule properties: Multivariate Learned Adaptive Noise schedule (MuLAN) (ours) versus a typical scalar noise schedule. Unlike scalar noise schedules, MuLAN's multivariate and input-adaptive properties improve likelihood. (Right) Likelihood in bits-per-dimension (BPD) on CIFAR-10 without data augmentation.

Abstract

Diffusion models have gained traction as powerful algorithms for synthesizing high-quality images. Central to these algorithms is the diffusion process, a set of equations that maps data to noise in a way that can significantly affect performance. In this paper, we explore whether the diffusion process can be learned from data. Our work is grounded in Bayesian inference and seeks to improve log-likelihood estimation by casting the learned diffusion process as an approximate variational posterior that yields a tighter lower bound (ELBO) on the likelihood. A widely held assumption is that the ELBO is invariant to the noise process: our work dispels this assumption and proposes multivariate learned adaptive noise (MuLAN), a learned diffusion process that applies noise at different rates across an image. Specifically, our method relies on a multivariate noise schedule that is a function of the data, which ensures that the ELBO is no longer invariant to the choice of noise schedule, unlike in previous works. Empirically, MuLAN sets a new state-of-the-art in density estimation on CIFAR-10 and ImageNet and reduces the number of training steps by 50%.
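To make the distinction concrete, the sketch below contrasts a standard scalar noise schedule (one log-SNR value shared by every pixel) with a MuLAN-style multivariate, input-adaptive schedule (a per-pixel log-SNR that depends on the image). This is a minimal illustrative sketch, not the paper's implementation: the linear log-SNR schedule and the data-dependent offset are stand-in assumptions for the learned networks described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def scalar_schedule(t):
    """Scalar noise schedule: one signal level alpha^2 shared by all pixels.
    The linear log-SNR here is an illustrative assumption."""
    log_snr = 10.0 - 20.0 * t                  # decreases as t goes 0 -> 1
    alpha2 = 1.0 / (1.0 + np.exp(-log_snr))    # sigmoid(log_snr)
    return np.full((3, 32, 32), alpha2)        # same value at every pixel

def mulan_style_schedule(t, x):
    """Multivariate, input-adaptive schedule (sketch): a per-pixel log-SNR
    shifted by a function of the image x. A toy heuristic offset stands in
    for the learned conditioning network in the paper."""
    offset = 2.0 * (x - x.mean())              # hypothetical data-dependent shift
    log_snr = 10.0 - 20.0 * t + offset
    return 1.0 / (1.0 + np.exp(-log_snr))      # per-pixel alpha^2, shape of x

def diffuse(x, alpha2, rng):
    """Variance-preserving forward process:
    q(z_t | x) = N(sqrt(alpha2) * x, (1 - alpha2) * I),
    where alpha2 may now vary per pixel."""
    eps = rng.standard_normal(x.shape)
    return np.sqrt(alpha2) * x + np.sqrt(1.0 - alpha2) * eps

# Toy CIFAR-10-shaped image in [-1, 1].
x = rng.uniform(-1.0, 1.0, size=(3, 32, 32))
z = diffuse(x, mulan_style_schedule(0.5, x), rng)
```

Because `alpha2` varies across pixels, different regions of the image are noised at different rates at the same diffusion time t, which is exactly the degree of freedom that makes the ELBO sensitive to the schedule.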

Poster

BibTeX

@inproceedings{sahoo2024diffusion,
  title={Diffusion Models With Learned Adaptive Noise},
  author={Subham Sekhar Sahoo and Aaron Gokaslan and Christopher De Sa and Volodymyr Kuleshov},
  booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems},
  year={2024},
  url={https://openreview.net/forum?id=loMa99A4p8}}