Momentum Approximation in Asynchronous Private Federated Learning

This paper was accepted for presentation at the International Workshop on Federated Foundation Models (FL@FM-NeurIPS’24), held in conjunction with NeurIPS 2024.
Asynchronous protocols have been shown to improve the scalability of federated learning (FL) with a massive number of clients. Meanwhile, momentum-based methods achieve the best model quality in synchronous FL. However, naively applying momentum in asynchronous FL algorithms leads to slower convergence and degraded model performance. It is still unclear how to effectively combine these two techniques to achieve a win-win…
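The paper's momentum-approximation method is not detailed in this excerpt. As background for the baseline the abstract refers to, a minimal sketch of server-side momentum in synchronous FL (FedAvg with server momentum) is shown below; the function name, hyperparameters, and toy data are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def server_momentum_step(weights, momentum, client_updates, lr=1.0, beta=0.9):
    """One server round of synchronous FL with server-side momentum.

    weights: current global model parameters
    momentum: server momentum buffer, carried across rounds
    client_updates: per-client deltas (local_weights - global_weights)
    """
    # Synchronous round: every sampled client reports, so a plain
    # average of the deltas is well defined. (In asynchronous FL,
    # updates arrive staggered and stale, which is what breaks
    # naive momentum per the abstract.)
    avg_delta = np.mean(client_updates, axis=0)
    # Momentum accumulates the averaged update direction across rounds.
    momentum = beta * momentum + avg_delta
    # Apply the momentum-smoothed update to the global model.
    weights = weights + lr * momentum
    return weights, momentum

# Toy usage: 3 clients, 4-dimensional model, starting from zeros.
w = np.zeros(4)
m = np.zeros(4)
updates = [np.full(4, 0.1), np.full(4, 0.2), np.full(4, 0.3)]
w, m = server_momentum_step(w, m, updates)
```

In the asynchronous setting, `client_updates` would instead trickle in one at a time from clients holding stale copies of `weights`, so the momentum buffer mixes directions computed against different model versions; that staleness is the mismatch the paper's approximation targets.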