Categories: Image

Radial Attention: O(n log n) Sparse Attention with Energy Decay for Long Video Generation

We just released Radial Attention, a sparse attention mechanism with O(n log n) computational complexity for long video generation.

🔍 Key Features:

  • ✅ Plug-and-play: works with pretrained models like #Wan, #HunyuanVideo, #Mochi
  • ✅ Speeds up both training and inference by 2–4×, without quality loss

All you need is a pre-defined static attention mask!
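To make the O(n log n) claim concrete, here is a hypothetical sketch (not the paper's actual mask; `radial_mask` and the `window` parameter are illustrative assumptions) of how a static sparse mask can keep a dense local band while thinning out attention at exponentially spaced distances, so each row holds roughly O(log n) bands and the full mask has O(n log n) nonzeros:

```python
import numpy as np

def radial_mask(n: int, window: int = 16) -> np.ndarray:
    # Hypothetical sketch, not the official Radial Attention mask.
    # Each query attends densely to a local window, then only to tokens
    # at exponentially spaced distances: the sampling stride doubles
    # roughly every time the temporal distance doubles (mirroring the
    # "energy decay" intuition), so each row has O(window * log n)
    # nonzeros and the whole mask has O(n log n).
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(n):
            d = abs(i - j)
            if d < window:
                mask[i, j] = True  # dense local band around the diagonal
            else:
                # stride grows with distance: farther tokens are sampled
                # more sparsely instead of being attended to one by one
                stride = 2 ** int(np.log2(d // window + 1))
                if d % stride == 0:
                    mask[i, j] = True
    return mask
```

Such a boolean mask could then be passed as a static `attn_mask` to a sparse attention kernel; because it depends only on |i - j|, it is the same for every layer and can be precomputed once.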

ComfyUI integration is in progress and will be released in ComfyUI-nunchaku!

Paper: https://arxiv.org/abs/2506.19852

Code: https://hanlab.mit.edu/projects/radial-attention

Website: https://hanlab.mit.edu/projects/radial-attention


submitted by /u/Dramatic-Cry-417

Published by AI Generated Robotic Content
Tags: ai images
