They confirmed that the SDXL weights won’t be released, and they’re probably going to do the same for the training code as well:
https://github.com/TencentQQGYLab/ELLA/issues/16#issuecomment-2046795891
I’m not being a cynic, but I’ve been in academia long enough to know how people (usually early in career) tend to (heavy stress on “tend to”) value the publication itself rather than attempting to make its results readily or even generally available. I’ll even go so far as to say that this implies “things” as well.
The results and descriptions in the paper almost exclusively involve SDXL, yet the weights released are for SD1.5. Furthermore, the training code hasn’t been released either and quite likely won’t ever see the light of day. There’s something off about making a publicized, documented claim that couldn’t have been made without PoC/evidence, and then scaling back on said claims and deliverables.
I get the weights being “a piece of investment” that’s being put out “for free”, but the training code itself? Maybe I’m being a cynic, but I just thought I’d let you all know that LaviBridge/ELLA’s route for prompt-adherence in SDXL is probably dead – better to look to SD3 now, I think.
submitted by /u/hexinx