
Elastic Weight Consolidation Improves the Robustness of Self-Supervised Learning Methods under Transfer

This paper was accepted at the workshop “Self-Supervised Learning – Theory and Practice” at NeurIPS 2022.
Self-supervised representation learning (SSL) methods provide an effective label-free initial condition for fine-tuning on downstream tasks. However, in many realistic scenarios, the downstream task may be biased with respect to the target label distribution. This bias in turn pulls the fine-tuned model posterior away from the initial, label-bias-free self-supervised model posterior. In this work, we re-interpret SSL fine-tuning through the lens of Bayesian continual learning and…
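To make the continual-learning framing concrete, below is a minimal sketch of how Elastic Weight Consolidation (Kirkpatrick et al., 2017) could anchor fine-tuning to an SSL initialization. This is an illustration of the general EWC recipe, not the paper's implementation; the names `fisher_diagonal`, `ewc_penalty`, `ssl_params`, and `lam` are assumptions introduced here for clarity.

```python
import torch

# Sketch of EWC-regularized fine-tuning from an SSL checkpoint.
# The total objective is assumed to be:
#   L = L_task + (lam / 2) * sum_i F_i * (theta_i - theta_i_SSL)^2
# where F is a diagonal Fisher information estimate at the SSL solution.

def fisher_diagonal(model, loader, loss_fn):
    """Empirical diagonal Fisher estimate at the current (SSL) parameters.

    Uses squared gradients of the task loss as a common approximation to
    the true Fisher information.
    """
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for x, y in loader:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / max(len(loader), 1) for n, f in fisher.items()}

def ewc_penalty(model, ssl_params, fisher_diag):
    """Quadratic penalty pulling parameters back toward the SSL initialization."""
    penalty = torch.zeros((), dtype=torch.float32)
    for n, p in model.named_parameters():
        penalty = penalty + (fisher_diag[n] * (p - ssl_params[n]) ** 2).sum()
    return penalty

# Usage sketch: snapshot the SSL weights before fine-tuning, then add the
# penalty to each fine-tuning step's task loss.
#   ssl_params = {n: p.detach().clone() for n, p in model.named_parameters()}
#   fisher = fisher_diagonal(model, probe_loader, loss_fn)
#   total_loss = task_loss + (lam / 2) * ewc_penalty(model, ssl_params, fisher)
```

Intuitively, the Fisher term weights each parameter's penalty by how much the self-supervised posterior constrains it, so fine-tuning on a biased downstream task can move loosely constrained directions freely while staying close to the SSL solution elsewhere.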