Elastic Weight Consolidation Improves the Robustness of Self-Supervised Learning Methods under Transfer

This paper was accepted at the workshop “Self-Supervised Learning – Theory and Practice” at NeurIPS 2022.
Self-supervised representation learning (SSL) methods provide an effective label-free initialization for fine-tuning on downstream tasks. However, in many realistic scenarios, the downstream task may be biased with respect to the target label distribution. This in turn shifts the fine-tuned model posterior away from the initial, (label-)bias-free self-supervised model posterior. In this work, we re-interpret SSL fine-tuning through the lens of Bayesian continual learning and…
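The excerpt ends before the method details, but the mechanism named in the title, Elastic Weight Consolidation (Kirkpatrick et al., 2017), adds a quadratic penalty $\frac{\lambda}{2}\sum_i F_i(\theta_i - \theta^*_i)^2$ that discourages fine-tuned weights $\theta$ from drifting away from anchor weights $\theta^*$ (here, the SSL initialization), weighted by a diagonal Fisher estimate $F$. Below is a minimal PyTorch sketch of standard EWC under that reading; `estimate_diag_fisher`, `ewc_penalty`, `loss_fn`, and `lam` are illustrative names, not taken from the paper.

```python
import torch

def estimate_diag_fisher(model, loss_fn, data_loader, n_batches=32):
    """Diagonal Fisher estimate: mean squared gradient of the loss per parameter."""
    fisher = {n: torch.zeros_like(p)
              for n, p in model.named_parameters() if p.requires_grad}
    seen = 0
    for batch in data_loader:
        if seen >= n_batches:
            break
        model.zero_grad()
        loss_fn(model, batch).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2  # accumulate squared grads
        seen += 1
    return {n: f / max(seen, 1) for n, f in fisher.items()}

def ewc_penalty(model, fisher, anchor, lam=1.0):
    """(lam / 2) * sum_i F_i * (theta_i - theta*_i)^2, anchoring fine-tuning
    to the self-supervised weights theta* stored in `anchor`."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for n, p in model.named_parameters():
        if n in fisher:
            penalty = penalty + (fisher[n] * (p - anchor[n]) ** 2).sum()
    return 0.5 * lam * penalty

# Hypothetical fine-tuning usage:
#   anchor = {n: p.detach().clone() for n, p in ssl_model.named_parameters()}
#   fisher = estimate_diag_fisher(ssl_model, ssl_loss_fn, unlabeled_loader)
#   loss = task_loss + ewc_penalty(model, fisher, anchor, lam=100.0)
```

In the Bayesian continual-learning reading the abstract gestures at, this penalty is a Laplace approximation to the SSL posterior, so fine-tuning is regularized toward the label-bias-free solution rather than just toward small weights.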