Generalization and Overfitting

May 28, 2024
Philosophy of Computational Sciences, High-Performance Computing Center Stuttgart (HLRS)

Nobelstraße 19
Stuttgart
Germany

This event is available both online and in person.

Organisers:

University of Stuttgart

Details

A large part of the recent success of highly parameterized ML models is due to their apparent ability to generalize to unseen data. This ability is seemingly in tension with mathematical results from traditional statistics (e.g. the bias-variance trade-off) and statistical learning theory (e.g. PAC theorems), which rely heavily either on strong assumptions about the underlying probability distribution or on restrictions on the hypothesis class. The predominant engineering epistemology holds that this is a failure of ML theory and suggests that contemporary ML models generalize well even beyond the classical overfitting regime.
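
As a concrete anchor for the results mentioned above, the two classical statements can be sketched as follows; the notation is chosen here for illustration and is not taken from the workshop programme. For squared-error loss with irreducible noise variance \sigma^2, the expected error of an estimator \hat f_D trained on a dataset D decomposes as

\[
\mathbb{E}\big[(y - \hat f_D(x))^2\big]
= \big(\mathbb{E}_D[\hat f_D(x)] - f(x)\big)^2
+ \mathbb{E}_D\Big[\big(\hat f_D(x) - \mathbb{E}_D[\hat f_D(x)]\big)^2\Big]
+ \sigma^2 ,
\]

i.e. squared bias plus variance plus noise. A typical PAC-style bound for a finite hypothesis class \mathcal{H} states that, with probability at least 1 - \delta over a sample of size m, every h \in \mathcal{H} satisfies

\[
R(h) \;\le\; \widehat{R}(h) + \sqrt{\frac{\ln|\mathcal{H}| + \ln(1/\delta)}{2m}} .
\]

Bounds of this kind become vacuous when the hypothesis class is very large relative to the sample size, which is one way of stating the tension with highly parameterized models.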

This workshop aims to shed light on the tension between generalization and overfitting. For more information and a schedule, please see our website: https://philo.hlrs.de/?p=415.

Registration: No