Elastic Weight Consolidation Improves the Robustness of Self-Supervised Learning Methods under Transfer
This paper was accepted at the workshop “Self-Supervised Learning – Theory and Practice” at NeurIPS 2022.

Self-supervised representation learning (SSL) methods provide an effective label-free initial condition for fine-tuning on downstream tasks. However, in many realistic scenarios, the downstream task may be biased with respect to the target label distribution. This in turn moves the learned …
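As a rough illustration of the technique named in the title, the sketch below shows how an Elastic Weight Consolidation (EWC) penalty (Kirkpatrick et al., 2017) can be added to a fine-tuning loss so that the fine-tuned weights stay close to the self-supervised initialization. It is a minimal sketch under assumptions: the function names, the diagonal-Fisher estimate, and the `lambda_ewc` hyperparameter are illustrative, not the paper's exact implementation.

```python
# Hedged sketch: EWC-regularized fine-tuning of an SSL-pretrained classifier.
# All names (model, data_loader, lambda_ewc) are illustrative assumptions.
import torch
import torch.nn.functional as F


def estimate_diagonal_fisher(model, data_loader, device="cpu"):
    """Approximate the diagonal Fisher information at the pretrained weights."""
    fisher = {n: torch.zeros_like(p)
              for n, p in model.named_parameters() if p.requires_grad}
    model.eval()
    for inputs, targets in data_loader:
        model.zero_grad()
        log_probs = F.log_softmax(model(inputs.to(device)), dim=-1)
        F.nll_loss(log_probs, targets.to(device)).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2  # squared gradients ~ diagonal Fisher
    for n in fisher:
        fisher[n] /= max(len(data_loader), 1)
    return fisher


def ewc_penalty(model, anchor_params, fisher):
    """Quadratic penalty pulling weights back toward the SSL initialization."""
    penalty = 0.0
    for n, p in model.named_parameters():
        if n in fisher:
            penalty = penalty + (fisher[n] * (p - anchor_params[n]) ** 2).sum()
    return penalty


def finetune_step(model, anchor_params, fisher, batch, optimizer, lambda_ewc=1.0):
    """One fine-tuning step: downstream task loss plus the EWC regularizer."""
    inputs, targets = batch
    optimizer.zero_grad()
    task_loss = F.cross_entropy(model(inputs), targets)
    loss = task_loss + 0.5 * lambda_ewc * ewc_penalty(model, anchor_params, fisher)
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch, `anchor_params` would be a frozen copy of the SSL-pretrained weights, e.g. `{n: p.detach().clone() for n, p in model.named_parameters()}`, captured before fine-tuning begins; `lambda_ewc` controls how strongly the fine-tuned posterior is kept near the bias-free self-supervised solution.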