Training Diffusion Models with micro-budget (8xH100) in only 3 days!

🧵Scaling up generative models is crucial to unlock new capabilities. But scaling down is equally necessary to democratize the end-to-end development of generative models.

Excited to share our new work on scaling down diffusion generative models by drastically reducing the overhead of training them from scratch.

Now anyone can train a Stable Diffusion-quality model from scratch for just $2,000 (2.6 training days on a single 8xH100 node).
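As a rough sanity check on the headline number, the quoted budget is consistent with current cloud pricing. The $4/GPU-hour rate below is an assumption for illustration (actual H100 rental prices vary by provider), not a figure from the paper:

```python
# Back-of-envelope cost estimate for the quoted training run.
# Assumption: ~$4 per H100 GPU-hour (illustrative cloud rental rate).
days = 2.6          # training days reported in the post
gpus = 8            # single 8xH100 node
rate_per_gpu_hour = 4.0  # assumed USD per GPU-hour

total_gpu_hours = days * 24 * gpus
estimated_cost = total_gpu_hours * rate_per_gpu_hour
print(f"{total_gpu_hours:.0f} GPU-hours -> ~${estimated_cost:,.0f}")
```

At that assumed rate, 2.6 days on 8 GPUs comes to roughly $2,000, matching the claimed budget.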

arxiv.org/abs/2407.15811

submitted by /u/1wndrla17