Scaling up generative models is crucial to unlock new capabilities. But scaling down is equally necessary to democratize the end-to-end development of generative models. Excited to share our new work on scaling down diffusion generative models by drastically reducing the overhead of training them from scratch. Now anyone can train a stable-diffusion quality model from scratch in just $2,000 (2.6 training days on a single 8xH100 node). arxiv.org/abs/2407.15811 submitted by /u/1wndrla17
Large language models (LLMs) are not only good at understanding and generating text; they can…
The initiative brings together some of the world's most prestigious research institutions to pioneer the…
Current speech translation systems, while achieving impressive accuracy, are rather static in their behavior…
The vibe coding tool Cursor, from startup Anysphere, has introduced Composer, its first in-house, proprietary…
The second major cloud outage in less than two weeks, Azure's downtime highlights the "brittleness"…