Archive

FOLDS seminar: Learning in Strategic Queuing

Zoom link: https://upenn.zoom.us/j/98220304722   Over the last two decades we have developed a good understanding of how to quantify the impact of strategic user behavior on outcomes in many games (including traffic routing […]

FOLDS Seminar: ACS: An interactive framework for machine-assisted selection with model-free guarantees

Zoom link: https://upenn.zoom.us/j/98220304722   In this talk, I will introduce adaptive conformal selection (ACS), an interactive framework for model-free selection with guaranteed error control. Building on conformal selection (Jin and Candès, […]

FOLDS seminar: Weak to Strong Generalization in Random Feature Models

Zoom link: https://upenn.zoom.us/j/98220304722   Weak-to-Strong Generalization (Burns et al., 2023) is the phenomenon whereby a strong student, say GPT-4, learns a task from a weak teacher, say GPT-2, and ends up […]

FOLDS seminar: An Information Geometric Understanding of Deep Learning

Zoom link: https://upenn.zoom.us/j/98220304722   I will argue that properties of natural data are what predominantly make deep networks so effective. To that end, I will show that deep networks work well […]

FOLDS seminar: A New Paradigm for Learning with Distribution Shift

Zoom link: https://upenn.zoom.us/j/98220304722   We revisit the fundamental problem of learning with distribution shift, where a learner is given labeled samples from training distribution D, unlabeled samples from test distribution D′ and […]

FOLDS seminar: Theory and practice of LLM quantization

Zoom link: https://upenn.zoom.us/j/98220304722   Modern LLMs process information by repeatedly applying a basic primitive of matrix multiplication. Estimates show that about 60-84% of the energy consumed by LLMs goes into […]

FOLDS seminar: Propagation-of-Chaos in Shallow Neural Networks beyond Logarithmic Time

Zoom link: https://upenn.zoom.us/j/98220304722   The analysis of gradient-based learning of Neural Networks remains an outstanding challenge, even for the simplest shallow architectures. A powerful mathematical framework that has emerged over recent […]

FOLDS seminar: Heaviside Composite Optimization: A new paradigm of optimization

Zoom link: https://upenn.zoom.us/j/98220304722   A Heaviside function is an indicator function of a semi-infinite interval. A Heaviside composite function is a Heaviside function composed with a multivariate function that may be […]

FOLDS seminar: Algorithmic stability for regression and classification

In a supervised learning setting, a model fitting algorithm is unstable if small perturbations to the input (the training data) can often lead to large perturbations in the output (say, […]

FOLDS Seminar: Positive random walks and positive-semidefinite random matrices

On the real line, a random walk that can only move in the positive direction is very unlikely to remain close to its starting point. After a fixed number of […]
