Many people have reported that LoRA training works poorly for z-image base. Less than 12 hours ago, someone on Bilibili claimed to have found the cause: the uint8 state used by the AdamW8bit optimizer. According to the author, you have to use an FP8 optimizer for z-image base instead. The author posted some comparisons in the post; check https://b23.tv/g7gUFIZ for more info.
submitted by /u/Recent-Source-7777
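To see why 8-bit optimizer state can matter, here is a minimal pure-Python sketch of the precision loss from storing a small Adam moment in a single uint8. This uses a simplified linear absmax-style scheme, not bitsandbytes' actual blockwise dynamic quantization, and the values are illustrative assumptions, so it only conveys the general idea behind the claim:

```python
# Hedged illustration: round-tripping a small optimizer-state value
# through an 8-bit linear quantizer. Small values can come back with
# an error larger than the value itself.

def quantize_uint8(x, scale):
    """Map x in [-scale, scale] linearly onto the integer range [0, 255]."""
    q = round((x / scale) * 127.5 + 127.5)
    return max(0, min(255, q))

def dequantize_uint8(q, scale):
    """Map the integer code back to a float in [-scale, scale]."""
    return ((q - 127.5) / 127.5) * scale

moment = 3.1e-4   # hypothetical small Adam moment value (assumption)
scale = 1.0       # hypothetical absmax of the quantized tensor (assumption)

restored = dequantize_uint8(quantize_uint8(moment, scale), scale)
err = abs(restored - moment)
# With only 256 levels across [-1, 1], the step size is ~0.0078, so the
# round-trip error here dwarfs the original value.
```

An FP8 format instead spends bits on an exponent, so tiny magnitudes like this are represented with far smaller relative error, which is the gist of the Bilibili author's argument.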
When building machine learning models, training is only half the journey.
Marketing teams face major challenges in creating campaigns in today’s digital environment. They must navigate through…
During a hearing at the US Senate, Netflix co-CEO Ted Sarandos said the company is…
An agentic AI tool for battery researchers harnesses data from previous battery designs to predict…