
Why are we still training LoRA and not moved to DoRA as a standard?

Just wondering, this has been a head-scratcher for me for a while.

Everywhere I look, the claim is that DoRA is superior to LoRA in seemingly every aspect, and that it doesn't require more power or resources to train.
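For anyone unfamiliar with what actually differs between the two: DoRA keeps the same low-rank pair as LoRA but re-parameterizes the weight into a magnitude and a direction, with the low-rank update steering only the direction. Here's a rough NumPy sketch of the merged weights (my own illustration of the idea, not any trainer's actual code; shapes and init values are made-up assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 16, 16, 4

W0 = rng.standard_normal((d_out, d_in))   # frozen base weight
B = np.zeros((d_out, r))                  # low-rank pair; B starts at zero,
A = rng.standard_normal((r, d_in)) * 0.01 # as in standard LoRA init

# LoRA: the low-rank update is simply added onto the base weight.
W_lora = W0 + B @ A

# DoRA: split the weight into a per-column magnitude vector m (a
# separately trained parameter, initialized from W0) and a direction;
# the low-rank update modifies only the normalized direction.
m = np.linalg.norm(W0, axis=0, keepdims=True)
V = W0 + B @ A
W_dora = m * (V / np.linalg.norm(V, axis=0, keepdims=True))
```

That column-norm over the full (W0 + BA) matrix is the main structural difference, and it's recomputed each training step, which is part of why DoRA can behave differently from LoRA on the exact same dataset and hyperparameters.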

I googled DoRA training for newer models (Wan, Qwen, etc.) and didn't find anything except a Reddit post from a year ago asking pretty much exactly what I'm asking here today, lol. Every comment there seemed to agree DoRA is superior. And ComfyUI has supported DoRA for a long time now.

Yet here we are, still training LoRAs when there's supposedly been a better option for years. This community is usually quick to adopt the latest and greatest, so it's odd this slipped through. I use diffusion-pipe to train pretty much everything now, and I'm curious whether there's a way to train DoRAs with it, or whether any other method out there right now is capable of training a Wan DoRA.

Thanks for any insight; curious to hear others' opinions on this.

Edit: very insightful and interesting responses; my opinion has definitely shifted. @roger_ducky has a great explanation of DoRA drawbacks I was unaware of. It was also interesting to hear from people who got worse results with DoRA than with LoRA on the same dataset/params. It sounds like sometimes LoRA is better and sometimes DoRA is better, but DoRA is certainly not better in every instance, as I was initially led to believe. Still, DoRAs deserve more exploration and testing than they've had, especially with newer models.

submitted by /u/Realistic_Rabbit5429

Published by
AI Generated Robotic Content