IDEAS/STAT Optimization Seminar: “Theoretical foundations for multi-agent learning”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, PA, United States

As learning algorithms become increasingly capable of acting autonomously, it is important to better understand the behavior that results from their interactions. For example, a pervasive challenge in multi-agent learning, spanning both theory and practice and dating back decades, has been the failure of iterative algorithms such as gradient descent to converge. Accordingly, […]
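The non-convergence mentioned above is easy to reproduce on the simplest two-player example. The following is an illustration, not material from the talk: simultaneous gradient descent-ascent on the zero-sum bilinear game f(x, y) = xy spirals away from the equilibrium at the origin rather than converging to it.

```python
import numpy as np

# Zero-sum bilinear game f(x, y) = x * y: player 1 minimizes over x,
# player 2 maximizes over y. The unique equilibrium is (0, 0).
x, y = 1.0, 1.0
lr = 0.1
radii = []
for _ in range(200):
    gx, gy = y, x                    # partial derivatives: df/dx = y, df/dy = x
    x, y = x - lr * gx, y + lr * gy  # simultaneous descent-ascent step
    radii.append(float(np.hypot(x, y)))

print(radii[0], radii[-1])  # the distance to equilibrium grows every step
```

Each step multiplies the distance to the origin by sqrt(1 + lr^2), so the iterates spiral outward; this is the classic failure mode that motivates remedies such as averaging, extragradient, and optimistic updates.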

MEAM Seminar: “Neural Operator for Scientific Computing”

Wu and Chen Auditorium (Room 101), Levine Hall, 3330 Walnut Street, Philadelphia, PA, United States

Accurate simulations of physical phenomena governed by partial differential equations (PDEs) are foundational to scientific computing. While traditional numerical methods have proven effective, they remain computationally intensive, particularly for complex, large-scale systems. This talk introduces the neural operator, a machine learning framework that approximates solution operators in infinite-dimensional spaces, enabling efficient and scalable PDE simulations […]
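To make the phrase "approximates solution operators" concrete, here is a toy sketch of operator learning in general, not the framework presented in the talk: for the 1D periodic Poisson equation -u'' = f, the solution operator is diagonal in Fourier space, so it can be "learned" from input-output function pairs by fitting one complex multiplier per frequency.

```python
import numpy as np

# Toy sketch of operator learning: the solution operator of the 1D periodic
# Poisson equation -u'' = f is diagonal in Fourier space (u_hat = f_hat / k^2),
# so it can be recovered as one complex multiplier per frequency.
rng = np.random.default_rng(1)
n = 64
k = np.fft.fftfreq(n, d=1.0 / n)        # integer wavenumbers on the grid

def true_operator(f_hat):
    """Ground-truth solution operator applied in Fourier space."""
    u_hat = np.zeros_like(f_hat)
    nz = k != 0
    u_hat[nz] = f_hat[nz] / k[nz] ** 2  # the zero mode is fixed to zero (mean-free solution)
    return u_hat

# Training pairs: random forcings (represented by Fourier coefficients)
# together with their exact solutions.
F = rng.normal(size=(200, n)) + 1j * rng.normal(size=(200, n))
U = np.array([true_operator(f) for f in F])

# Fit one multiplier per mode by per-frequency least squares.
m = np.sum(np.conj(F) * U, axis=0) / np.sum(np.abs(F) ** 2, axis=0)

# The learned operator generalizes to an unseen forcing.
f_new = rng.normal(size=n) + 1j * rng.normal(size=n)
u_true = true_operator(f_new)
err = np.linalg.norm(m * f_new - u_true) / np.linalg.norm(u_true)
print(err)
```

Because the target operator really is diagonal here, the fit is exact up to rounding; practical neural operators learn far richer (nonlinear, non-diagonal) mappings, but the function-to-function viewpoint is the same.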

IDEAS/STAT Optimization Seminar: “ML for an Interactive World: From Learning to Unlearning”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, PA, United States

The remarkable recent success of Machine Learning (ML) is driven by our ability to develop and deploy interactive models that can solve complicated tasks by understanding and adapting to the ever-changing state of the world. However, the development of such models demands significant data and computing resources. Moreover, as these models increasingly interact with humans, […]
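One concrete instance of "unlearning" (a standard baseline for linear models, not necessarily the methods covered in the talk): for ridge regression, a training point can be removed exactly with a Sherman-Morrison downdate of the inverse Gram matrix, avoiding a full retrain.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 50, 5, 1e-2
X = rng.normal(size=(n, d))
y = rng.normal(size=n)

def ridge(X, y):
    """Ridge regression solution w = (X^T X + lam*I)^{-1} X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Train on all n points, then "unlearn" point 0 without retraining.
A_inv = np.linalg.inv(X.T @ X + lam * np.eye(d))
b = X.T @ y
x0, y0 = X[0], y[0]

# Sherman-Morrison downdate: (A - x0 x0^T)^{-1} from A^{-1} in O(d^2).
u = A_inv @ x0
A_inv_new = A_inv + np.outer(u, u) / (1.0 - x0 @ u)
w_unlearned = A_inv_new @ (b - y0 * x0)

# Ground truth: retrain from scratch on the remaining n - 1 points.
w_retrained = ridge(X[1:], y[1:])
print(np.allclose(w_unlearned, w_retrained))  # True
```

The downdated model is exactly the model that would have been obtained had the point never been seen, which is the gold standard that approximate unlearning methods for large models try to emulate cheaply.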

IDEAS/STAT Optimization Seminar: “Data-Driven Algorithm Design and Verification for Parametric Convex Optimization”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, PA, United States

Zoom link: https://upenn.zoom.us/j/98220304722

Abstract: We present computational tools for analyzing and designing first-order methods in parametric convex optimization. These methods are popular for their low per-iteration cost and warm-starting capabilities. However, precisely quantifying the number of iterations required to compute high-quality solutions remains a key challenge, especially in real-time applications. First, we introduce a […]
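The warm-starting capability mentioned in the abstract is easy to see on a toy parametric problem (a sketch only; the talk's tools and problem class are more general): projected gradient descent on a box-constrained quadratic, where starting each solve from the previous parameter's solution cuts the iteration count as the parameter drifts slowly.

```python
import numpy as np

# Parametric family: minimize 0.5*||x - theta||^2 subject to 0 <= x <= 1,
# solved by projected gradient descent for a slowly drifting theta.

def solve(theta, x0, step=0.3, tol=1e-6, max_iter=10_000):
    """Projected gradient descent; returns the solution and iteration count."""
    x = x0.copy()
    for it in range(1, max_iter + 1):
        x_new = np.clip(x - step * (x - theta), 0.0, 1.0)  # gradient step + projection
        if np.linalg.norm(x_new - x) < tol:
            return x_new, it
        x = x_new
    return x, max_iter

rng = np.random.default_rng(0)
d = 20
thetas = [rng.uniform(-0.5, 1.5, size=d)]
for _ in range(9):
    thetas.append(thetas[-1] + 0.01 * rng.normal(size=d))  # parameter drift

cold = warm = 0
x_prev = np.zeros(d)
for theta in thetas:
    _, it_cold = solve(theta, np.zeros(d))   # cold start from zero
    x_prev, it_warm = solve(theta, x_prev)   # warm start from previous solution
    cold += it_cold
    warm += it_warm

print(cold, warm)  # warm-started solves need fewer total iterations
```

The gap reflects the linear convergence rate: iterations scale with the log of the initial distance to the solution, which warm-starting shrinks dramatically when consecutive parameters are close.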

ESE Spring Seminar – “Generalization, Memorization, and Privacy in Trustworthy Machine Learning”

Raisler Lounge (Room 225), Towne Building, 220 South 33rd Street, Philadelphia, PA, United States

Machine learning is transforming numerous aspects of modern society, and its expanding use in high-stakes applications calls for responsible development. In this talk, I will present my research on the foundations and methodologies for building trustworthy ML, centered on three interconnected challenges: generalization, memorization, and privacy. First, I will show how information-theoretic tools can be […]

ESE Spring Seminar – “Can Robots Learn from Machine Dreams? – Robot Learning via GenAI-powered World Models”

Raisler Lounge (Room 225), Towne Building, 220 South 33rd Street, Philadelphia, PA, United States

Over the past decade, large-scale pre-training followed by alignment has revolutionized natural language processing and computer vision. Yet, robotics remains constrained by the scarcity of real-world data. In this talk, I will present our systematic approach to overcoming this bottleneck by building increasingly rich world models from data. I will first introduce our distilled feature […]

IDEAS/STAT Optimization Seminar: “Statistics-Powered ML: Building Trust and Robustness in Black-Box Predictions”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, PA, United States

Zoom link: https://upenn.zoom.us/j/98220304722

Abstract: Modern ML models produce valuable predictions across various applications, influencing people’s lives, opportunities, and scientific advancements. However, these systems can fail in unexpected ways, generating unreliable inferences and perpetuating biases present in the data. These issues are particularly troubling in high-stakes applications, where models are trained on increasingly diverse, incomplete, and […]

IDEAS/STAT Optimization Seminar: “The Size of Teachers as a Measure of Data Complexity: PAC-Bayes Excess Risk Bounds and Scaling Laws”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, PA, United States

Zoom link: https://upenn.zoom.us/j/98220304722

Abstract: We study the generalization properties of neural networks through the lens of data complexity. Recent work by Buzaglo et al. (2024) shows that random (nearly) interpolating networks generalize, provided there is a small “teacher” network that achieves small excess risk. We give a short single-sample PAC-Bayes proof of this result and […]
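For readers unfamiliar with the proof technique, the classical PAC-Bayes bound that such arguments build on is stated below. This is generic background in a standard (McAllester-style) form, not the talk's single-sample variant: for any prior P fixed before seeing the n training samples, with probability at least 1 - δ, every posterior Q satisfies

```latex
\mathbb{E}_{h \sim Q}\big[L(h)\big]
  \;\le\;
\mathbb{E}_{h \sim Q}\big[\hat{L}(h)\big]
  \;+\;
\sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}},
```

where L is the population risk and L̂ the empirical risk. Teacher-based arguments exploit this by choosing a prior concentrated near a small teacher network, making the KL term scale with the teacher's size rather than the full model's.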

IDEAS/STAT Optimization Seminar: “Stochastic-Gradient-based Algorithms for Solving Nonconvex Constrained Optimization Problems”

Amy Gutmann Hall, Room 615, 3333 Chestnut Street, Philadelphia, PA, United States

Zoom link: https://upenn.zoom.us/j/98220304722

Abstract: I will present recent work by my research group on the design and analysis of stochastic-gradient-based algorithms for solving nonconvex constrained optimization problems, which may arise, for example, in informed supervised learning. I will focus in particular on algorithmic strategies that have consistently been shown to exhibit the best practical […]
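As a simple baseline from this problem class (a sketch for orientation, not the speaker's algorithms): projected stochastic gradient descent on a nonconvex objective with box constraints, using noisy gradient estimates and a diminishing stepsize.

```python
import numpy as np

# Minimize f(x) = (x1^2 - 1)^2 + x2^2 subject to -2 <= x <= 2, using only
# noisy gradient estimates. The problem is nonconvex with minimizers (+-1, 0).
rng = np.random.default_rng(0)

def grad(x):
    """Exact gradient of f; the oracle below adds noise to it."""
    return np.array([4.0 * x[0] * (x[0] ** 2 - 1.0), 2.0 * x[1]])

x = np.array([1.8, 1.8])
for t in range(1, 5001):
    g = grad(x) + 0.1 * rng.normal(size=2)            # stochastic gradient oracle
    x = np.clip(x - 0.2 / np.sqrt(t) * g, -2.0, 2.0)  # step, then project onto the box

f = (x[0] ** 2 - 1.0) ** 2 + x[1] ** 2
print(x, f)  # iterate stays feasible and settles near a local minimizer
```

Projection keeps every iterate feasible, and the O(1/sqrt(t)) stepsize damps the gradient noise; more sophisticated methods of the kind the abstract describes handle general (e.g. nonlinear equality) constraints, where a simple projection is unavailable.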

IDEAS/STAT Optimization Seminar: “Gradient Equilibrium in Online Learning”

Amy Gutmann Hall, Room 414, 3333 Chestnut Street, Philadelphia, PA, United States

We present a new perspective on online learning that we refer to as gradient equilibrium: a sequence of iterates achieves gradient equilibrium if the average of gradients of losses along the sequence converges to zero. In general, this condition is not implied by, nor implies, sublinear regret. It turns out that gradient equilibrium is achievable […]
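The definition can be checked numerically on a small example (illustrative only, not taken from the talk): online gradient descent on a stream of quadratic losses f_t(x) = 0.5*(x - a_t)^2 drives the running average of the played gradients toward zero, even though the individual gradients do not vanish.

```python
import numpy as np

# Online gradient descent on losses f_t(x) = 0.5 * (x - a_t)^2 with random
# targets a_t. Gradient equilibrium asks that the running average of the
# gradients evaluated at the played iterates converge to zero.
rng = np.random.default_rng(0)
x = 0.0
grad_sum = 0.0
T = 20_000
avgs = []
for t in range(1, T + 1):
    a = rng.normal(loc=2.0)          # target of round t's loss
    g = x - a                        # gradient of f_t at the current iterate
    grad_sum += g
    avgs.append(grad_sum / t)
    x -= 0.1 / np.sqrt(t) * g        # gradient step with diminishing rate

print(abs(avgs[-1]))  # running average of gradients is close to zero
```

Here each round's gradient is roughly a unit-scale random quantity, but the average over rounds shrinks toward zero, which is exactly the equilibrium property rather than a statement about any single round.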