TRACT: Denoising Diffusion Models with Transitive Closure Time-Distillation

Denoising Diffusion models have demonstrated their proficiency for generative sampling. However, generating good samples often requires many iterations. Consequently, techniques such as binary time-distillation (BTD) have been proposed to reduce the number of network calls for a fixed architecture. In this paper, we introduce TRAnsitive Closure Time-distillation (TRACT), a new method that extends BTD. For single-step diffusion, TRACT improves FID by up to 2.4x on the same architecture, and achieves a new single-step Denoising Diffusion Implicit Models (DDIM) state-of-the-art FID (7.4 for…
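The abstract only names the mechanism, so a small illustration may help. Below is a minimal, self-contained PyTorch sketch of the idea behind binary time-distillation, the baseline that TRACT extends: a frozen teacher runs two DDIM denoising steps, and a student is trained to reproduce the result in a single step, halving the number of network calls per distillation phase. Everything here is an illustrative assumption rather than the paper's implementation: the toy `Denoiser`, the linear noise schedule, and all hyperparameters are placeholders.

```python
# Hypothetical sketch of binary time-distillation (BTD) for a diffusion
# model. Not the TRACT paper's code: the network, schedule, and
# hyperparameters are all illustrative placeholders.
import torch
import torch.nn as nn


class Denoiser(nn.Module):
    """Toy epsilon-prediction network standing in for a U-Net."""

    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, 256), nn.SiLU(), nn.Linear(256, dim)
        )

    def forward(self, x, t):
        # Append the (normalized) timestep as an extra input feature.
        return self.net(torch.cat([x, t[:, None]], dim=-1))


def ddim_step(model, x, t_from, t_to, alpha_bar):
    """One deterministic DDIM update from timestep t_from down to t_to."""
    a_f, a_t = alpha_bar[t_from], alpha_bar[t_to]
    t_in = torch.full((x.shape[0],), t_from / (len(alpha_bar) - 1))
    eps = model(x, t_in)
    x0 = (x - (1 - a_f).sqrt() * eps) / a_f.sqrt()  # predicted clean sample
    return a_t.sqrt() * x0 + (1 - a_t).sqrt() * eps


@torch.no_grad()
def teacher_two_steps(teacher, x, t, alpha_bar):
    """Teacher target: two consecutive DDIM steps, t -> t-1 -> t-2."""
    mid = ddim_step(teacher, x, t, t - 1, alpha_bar)
    return ddim_step(teacher, mid, t - 1, t - 2, alpha_bar)


def btd_loss(student, teacher, x, t, alpha_bar):
    """Train the student to match the teacher's two steps in one: t -> t-2."""
    target = teacher_two_steps(teacher, x, t, alpha_bar)
    pred = ddim_step(student, x, t, t - 2, alpha_bar)
    return ((pred - target) ** 2).mean()


if __name__ == "__main__":
    T, dim = 1024, 64
    # Toy schedule of cumulative alpha-bar values: ~1 at t=0 (clean),
    # ~0 at t=T (pure noise).
    alpha_bar = torch.linspace(0.9999, 1e-4, T + 1)
    teacher, student = Denoiser(dim), Denoiser(dim)
    student.load_state_dict(teacher.state_dict())  # warm-start from teacher
    opt = torch.optim.Adam(student.parameters(), lr=1e-4)
    for _ in range(10):  # a few toy iterations
        x = torch.randn(32, dim)                # stand-in noisy inputs
        t = int(torch.randint(2, T + 1, (1,)))  # random starting timestep
        loss = btd_loss(student, teacher, x, t, alpha_bar)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Repeating this halving phase log2(T) times, with each converged student becoming the next teacher, is what drives the sampler toward a single network call; TRACT's transitive-closure formulation extends this scheme so that distillation is not limited to adjacent step pairs.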