Benign, Tempered, or Catastrophic: A Taxonomy of Overfitting
The practical success of overparameterized neural networks has motivated the recent scientific study of interpolating methods, which perfectly fit their training data. Certain interpolating methods, including neural networks, can fit noisy training data without catastrophically bad test performance, in defiance of standard intuitions from statistical learning theory. Aiming to explain this, a body of recent …