Categories: FAANG

Elastic Weight Consolidation Improves the Robustness of Self-Supervised Learning Methods under Transfer

This paper was accepted at the workshop “Self-Supervised Learning – Theory and Practice” at NeurIPS 2022.
Self-supervised representation learning (SSL) methods provide an effective label-free initial condition for fine-tuning on downstream tasks. However, in numerous realistic scenarios, the downstream task might be biased with respect to the target label distribution. This in turn moves the learned fine-tuned model posterior away from the initial (label) bias-free self-supervised model posterior. In this work, we re-interpret SSL fine-tuning under the lens of Bayesian continual learning and…
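The abstract frames fine-tuning as a continual-learning problem regularized with Elastic Weight Consolidation (EWC). As a rough illustration of the standard EWC mechanism (Kirkpatrick et al., 2017) applied to an SSL checkpoint, and not the paper's exact procedure, the sketch below anchors the fine-tuned weights to the SSL solution with a quadratic penalty weighted by a diagonal Fisher information estimate; the model, data loader, and lambda value are illustrative assumptions.

# Minimal EWC-regularized fine-tuning sketch (PyTorch); assumes a classifier head
# on top of an SSL-pretrained backbone, which is not specified by the abstract.
import torch
import torch.nn.functional as F

def estimate_diag_fisher(model, loader, device="cpu"):
    """Diagonal Fisher information estimated from the model's own predictions."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters() if p.requires_grad}
    model.eval()
    for x, _ in loader:
        x = x.to(device)
        model.zero_grad()
        log_probs = F.log_softmax(model(x), dim=-1)
        # Sample labels from the model's predictive distribution.
        sampled = torch.multinomial(log_probs.exp(), 1).squeeze(-1)
        F.nll_loss(log_probs, sampled).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    for n in fisher:
        fisher[n] /= max(len(loader), 1)
    return fisher

def ewc_penalty(model, ssl_params, fisher):
    """Quadratic anchor: sum_i F_i * (theta_i - theta_i^SSL)^2."""
    penalty = 0.0
    for n, p in model.named_parameters():
        if n in fisher:
            penalty = penalty + (fisher[n] * (p - ssl_params[n]) ** 2).sum()
    return penalty

def finetune_step(model, batch, ssl_params, fisher, optimizer, lam=100.0):
    """One fine-tuning step: downstream task loss plus the EWC anchor to the SSL weights."""
    x, y = batch
    optimizer.zero_grad()
    task_loss = F.cross_entropy(model(x), y)
    loss = task_loss + 0.5 * lam * ewc_penalty(model, ssl_params, fisher)
    loss.backward()
    optimizer.step()
    return loss.item()

In this reading, ssl_params holds a frozen copy of the pretrained weights, so the penalty discourages the biased downstream task from pulling the posterior far from the label-bias-free SSL solution.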