IDEAS/STAT Optimization Seminar: “Foundations of Deep Learning: Optimization and Representation Learning”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, United States

Deep learning's success stems from the ability of neural networks to automatically discover meaningful representations from raw data. In this talk, I will describe some recent insights into how optimization enables this learning process. First, I will show how optimization algorithms exhibit surprisingly rich dynamics when training neural networks, and how these complex dynamics are […]

IDEAS/STAT Optimization Seminar: “Theoretical foundations for multi-agent learning”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, United States

As learning algorithms become increasingly capable of acting autonomously, it is important to better understand the behavior that results from their interactions. For example, a pervasive challenge in multi-agent learning settings, which spans both theory and practice and dates back decades, has been the failure of convergence for iterative algorithms such as gradient descent. Accordingly, […]

IDEAS/STAT Optimization Seminar: “ML for an Interactive World: From Learning to Unlearning”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, United States

The remarkable recent success of Machine Learning (ML) is driven by our ability to develop and deploy interactive models that can solve complicated tasks by understanding and adapting to the ever-changing state of the world. However, the development of such models demands significant data and computing resources. Moreover, as these models increasingly interact with humans, […]

IDEAS/STAT Optimization Seminar: “Data-Driven Algorithm Design and Verification for Parametric Convex Optimization”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, United States

Zoom link: https://upenn.zoom.us/j/98220304722

Abstract: We present computational tools for analyzing and designing first-order methods in parametric convex optimization. These methods are popular for their low per-iteration cost and warm-starting capabilities. However, precisely quantifying the number of iterations required to compute high-quality solutions remains a key challenge, especially in real-time applications. First, we introduce a […]
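
As a concrete illustration of the warm-starting idea the abstract mentions (a minimal sketch, not the computational tools presented in the talk): solve a family of least-squares problems whose right-hand side varies with a parameter, initialize each solve at the previous solution, and count iterations to a fixed tolerance. The matrix, the parametric right-hand side, and the tolerance are illustrative assumptions.

    import numpy as np

    # Minimal warm-starting sketch on a parametric least-squares family
    # min_x 0.5 * ||A x - b(theta)||^2; all problem data here is made up.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient

    def solve(b, x0, tol=1e-6, max_iter=10_000):
        """Gradient descent with step 1/L; returns solution and iteration count."""
        x = x0.copy()
        for k in range(max_iter):
            g = A.T @ (A @ x - b)
            if np.linalg.norm(g) <= tol:
                return x, k
            x -= g / L
        return x, max_iter

    x_prev = np.zeros(20)
    for theta in np.linspace(0.0, 1.0, 5):      # slowly varying parameter
        b = np.sin(theta + np.arange(40))       # parametric right-hand side
        _, cold = solve(b, np.zeros(20))        # cold start from zero
        x_prev, warm = solve(b, x_prev)         # warm start from previous solution
        print(f"theta={theta:.2f}: cold {cold} iters, warm {warm} iters")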

IDEAS/STAT Optimization Seminar: “Statistics-Powered ML: Building Trust and Robustness in Black-Box Predictions”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, United States

Zoom link: https://upenn.zoom.us/j/98220304722

Abstract: Modern ML models produce valuable predictions across various applications, influencing people’s lives, opportunities, and scientific advancements. However, these systems can fail in unexpected ways, generating unreliable inferences and perpetuating biases present in the data. These issues are particularly troubling in high-stakes applications, where models are trained on increasingly diverse, incomplete, and […]

IDEAS/STAT Optimization Seminar: “The Size of Teachers as a Measure of Data Complexity: PAC-Bayes Excess Risk Bounds and Scaling Laws”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, United States

Zoom link: https://upenn.zoom.us/j/98220304722

Abstract: We study the generalization properties of neural networks through the lens of data complexity. Recent work by Buzaglo et al. (2024) shows that random (nearly) interpolating networks generalize, provided there is a small “teacher” network that achieves small excess risk. We give a short single-sample PAC-Bayes proof of this result and […]
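
For context, the generic PAC-Bayes template that such proofs build on is the following (this is the standard McAllester-style bound, not the excess-risk bound from the talk; Q is a posterior over hypotheses, P a prior, L and L̂ the population and empirical risks over n samples): with probability at least 1 − δ, simultaneously for all posteriors Q,

    \[
    \mathbb{E}_{h \sim Q}\bigl[L(h)\bigr] \;\le\;
    \mathbb{E}_{h \sim Q}\bigl[\hat{L}(h)\bigr]
    + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\!\bigl(2\sqrt{n}/\delta\bigr)}{2n}}.
    \]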

IDEAS/STAT Optimization Seminar: “Stochastic-Gradient-based Algorithms for Solving Nonconvex Constrained Optimization Problems”

Amy Gutmann Hall, Room 615

Zoom link: https://upenn.zoom.us/j/98220304722

Abstract: I will present recent work by my research group on the design and analysis of stochastic-gradient-based algorithms for solving nonconvex constrained optimization problems, which may arise, for example, in informed supervised learning. I will focus in particular on algorithmic strategies that have consistently been shown to exhibit the best practical […]
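
The snippet does not describe the group's algorithms, so the following is a baseline illustration of the problem class only: projected stochastic gradient descent on a toy nonconvex objective under a simple ball constraint. The objective, the constraint set, and the stepsize schedule are all assumptions.

    import numpy as np

    # Projected stochastic gradient descent for min E[f(x)] s.t. ||x|| <= 1,
    # with a toy nonconvex objective f(x) = sum(cos(x_i)); illustrative only.
    rng = np.random.default_rng(1)

    def project_ball(x, radius=1.0):
        """Euclidean projection onto the ball {x : ||x|| <= radius}."""
        n = np.linalg.norm(x)
        return x if n <= radius else x * (radius / n)

    def stochastic_grad(x):
        """Noisy gradient of f(x) = sum(cos(x_i))."""
        return -np.sin(x) + 0.1 * rng.standard_normal(x.shape)

    x = rng.standard_normal(5)
    for t in range(1, 501):
        x = project_ball(x - (0.5 / np.sqrt(t)) * stochastic_grad(x))
    print("final iterate:", x)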

IDEAS/STAT Optimization Seminar: “Gradient Equilibrium in Online Learning”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, United States

We present a new perspective on online learning that we refer to as gradient equilibrium: a sequence of iterates achieves gradient equilibrium if the average of the gradients of the losses along the sequence converges to zero. In general, this condition neither implies nor is implied by sublinear regret. It turns out that gradient equilibrium is achievable […]
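
In symbols (the notation is assumed here, with losses ℓ_t and iterates x_t), gradient equilibrium holds when

    \[
    \frac{1}{T} \sum_{t=1}^{T} \nabla \ell_t(x_t) \;\longrightarrow\; 0
    \quad \text{as } T \to \infty.
    \]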

IDEAS/STAT Optimization Seminar: “Resilient Distributed Optimization for Cyberphysical Systems”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, United States

Zoom link: https://upenn.zoom.us/j/98220304722

Abstract: This talk considers the problem of resilient distributed multi-agent optimization for cyberphysical systems in the presence of malicious or non-cooperative agents. It is assumed that stochastic values of trust between agents are available, which allows agents to learn their trustworthy neighbors while simultaneously performing updates to minimize their own local […]
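
A minimal sketch of the general idea, not the speakers' algorithm: each agent keeps a running average of its noisy pairwise trust observations, thresholds it to learn a trusted-neighbor set, and mixes consensus averaging over those neighbors with a local gradient step. The trust model, the threshold, and the local objectives are all illustrative.

    import numpy as np

    # Trust-weighted distributed optimization sketch: two adversarial agents,
    # local objectives ||x_i - target_i||^2; all constants are made up.
    rng = np.random.default_rng(2)
    n_agents, dim = 6, 3
    honest = np.array([True, True, True, True, False, False])
    targets = rng.standard_normal((n_agents, dim))
    x = rng.standard_normal((n_agents, dim))
    trust_sum = np.zeros((n_agents, n_agents))

    for t in range(1, 201):
        # Noisy trust observations: honest pairs score near 0.8, others near 0.2.
        obs = np.where(np.outer(honest, honest), 0.8, 0.2)
        obs = obs + 0.3 * rng.standard_normal((n_agents, n_agents))
        trust_sum += obs
        trusted = (trust_sum / t) > 0.5           # learned trusted-neighbor sets
        np.fill_diagonal(trusted, True)           # an agent always trusts itself
        x_new = np.empty_like(x)
        for i in range(n_agents):
            nbrs = np.where(trusted[i])[0]
            consensus = x[nbrs].mean(axis=0)      # average over trusted neighbors
            grad = 2.0 * (x[i] - targets[i])      # local gradient
            x_new[i] = consensus - (0.5 / t) * grad
        x = x_new
    print("honest agents' final iterates:\n", x[honest])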

IDEAS/STAT Optimization Seminar: “Negative Stepsizes Make Gradient-Descent-Ascent Converge”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, United States

Zoom link: https://upenn.zoom.us/j/98220304722

Abstract: Solving min-max problems is a central question in optimization, games, learning, and controls. Arguably the most natural algorithm is Gradient-Descent-Ascent (GDA); however, since the 1970s, conventional wisdom has held that it fails to converge even on simple problems. This failure spurred an extensive literature on modifying GDA with extragradients, optimism, momentum, anchoring, […]
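
The classical failure referenced here is easy to reproduce (a sketch of the failure mode only; the talk's negative-stepsize result is not reconstructed): simultaneous GDA on the bilinear problem min_x max_y x·y rotates and expands, so the iterates diverge for any constant positive stepsize.

    import numpy as np

    # Simultaneous GDA on f(x, y) = x * y: each step multiplies the norm of
    # (x, y) by sqrt(1 + eta^2), so the iterates spiral outward.
    eta = 0.1
    x, y = 1.0, 1.0
    for t in range(101):
        gx, gy = y, x                         # df/dx = y, df/dy = x
        x, y = x - eta * gx, y + eta * gy     # descent in x, ascent in y
        if t % 25 == 0:
            print(f"t={t:3d}  |(x, y)| = {np.hypot(x, y):.3f}")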