FOLDS seminar: Coherence Mechanisms for Provable Self-Improvement
Zoom link: https://upenn.zoom.us/j/98220304722 Large language models are increasingly trained to improve themselves, yet the mechanisms driving this, such as self-reflection or RLAIF, rely almost entirely on empirical heuristics. Is it possible […]
FOLDS seminar & PENN AI seminar: Optimization Challenges in Physics-Informed Neural Networks
Zoom link: https://upenn.zoom.us/j/98220304722 Physics-informed neural networks (PINNs) minimize composite losses that penalize PDE residuals alongside boundary and initial conditions. While this resembles multi-task learning, the optimization landscape is fundamentally different. […]
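The composite loss mentioned in the abstract can be illustrated with a small sketch. Everything below is an assumption for illustration: a toy 1-D heat equation u_t = u_xx, a closed-form parametric ansatz standing in for the network, and finite differences in place of automatic differentiation; the speaker's actual setup is not described here.

```python
import numpy as np

# Toy stand-in for a neural network: a smooth parametric ansatz.
# For w = 1, u(x, t) = exp(-t) * sin(x) exactly solves u_t = u_xx.
def model(x, t, w):
    return np.exp(-w * t) * np.sin(x)

def pinn_loss(w, xs, ts, eps=1e-4):
    # PDE residual u_t - u_xx, via central finite differences
    u_t = (model(xs, ts + eps, w) - model(xs, ts - eps, w)) / (2 * eps)
    u_xx = (model(xs + eps, ts, w) - 2 * model(xs, ts, w)
            + model(xs - eps, ts, w)) / eps**2
    residual = np.mean((u_t - u_xx) ** 2)

    # boundary conditions: u(0, t) = u(pi, t) = 0
    boundary = np.mean(model(0.0, ts, w) ** 2 + model(np.pi, ts, w) ** 2)

    # initial condition: u(x, 0) = sin(x)
    initial = np.mean((model(xs, 0.0, w) - np.sin(xs)) ** 2)

    # unit weights for illustration; weighting these terms is itself
    # a known difficulty in PINN optimization
    return residual + boundary + initial

xs = np.linspace(0.1, np.pi - 0.1, 32)
ts = np.linspace(0.01, 1.0, 32)
print(pinn_loss(1.0, xs, ts))  # near zero: w = 1 gives the exact solution
print(pinn_loss(0.5, xs, ts))  # clearly positive: PDE residual is violated
```

The point of the sketch is the loss structure: three terms with very different scales and curvatures are summed into one objective, which is why the landscape differs from ordinary multi-task learning.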
ESE Spring Seminar – “Uncompromising Performance with Exocompilation”
Performance is the currency of modern computing. Achieving peak throughput on fast‑evolving accelerators demands more control than today’s compilers provide. I will present Exo and the Exocompilation paradigm: a user‑schedulable […]
AI Research Seminar Series – “Toward Sustainable Data Centers for Artificial Intelligence”
As the impact of artificial intelligence (AI) continues to proliferate, computer architects must assess and mitigate its energy demands. This talk will survey strategies for reducing the energy used by […]
AI Innovation Seminar Series – “From Bytes to Atoms to Excavators – An engineer’s journey to the physical world”
Tom Eliaz, Co-Founder of Bedrock Robotics (Penn Engineering Class of 2002), shares his journey and lessons learned from Penn and a career spanning IBM, multiple startups and exits, public company leadership, […]
FOLDS seminar: Multi-step reasoning via curriculum learning
Zoom link: https://upenn.zoom.us/j/98220304722 Can multi-step reasoning be learned from data? We investigate this question in the context of a simple function composition task. We prove that this task is […]
FOLDS seminar: Fast Convergence of High-Order ODE Solvers for Diffusion Models
Zoom link: https://upenn.zoom.us/j/98220304722 Score-based diffusion models can be sampled efficiently by reformulating the reverse dynamics as a deterministic probability flow ODE and integrating it with high-order solvers. Since the […]
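The reformulation in the abstract can be sketched in one dimension. Everything here is an illustrative assumption, not the speaker's construction: data distributed as N(mu0, sig0^2) under a variance-preserving diffusion dx = -x/2 dt + dW, whose marginal score is available in closed form, integrated backward with classical RK4 as a stand-in for a high-order solver.

```python
import numpy as np

mu0, sig0 = 2.0, 0.5  # assumed 1-D data distribution N(mu0, sig0^2)

def score(x, t):
    # exact score of the diffused marginal at time t (closed form in 1-D)
    m = mu0 * np.exp(-t / 2)
    v = sig0**2 * np.exp(-t) + 1.0 - np.exp(-t)
    return -(x - m) / v

def pf_ode_rhs(x, t):
    # probability flow ODE: dx/dt = f(x, t) - (1/2) g(t)^2 * score(x, t)
    # with f = -x/2 and g = 1 for this diffusion
    return -x / 2 - 0.5 * score(x, t)

def sample(n, T=8.0, steps=40, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)  # start from the prior x_T ~ N(0, 1)
    h = -T / steps              # negative step: integrate backward in time
    t = T
    for _ in range(steps):      # classical RK4, a 4th-order solver
        k1 = pf_ode_rhs(x, t)
        k2 = pf_ode_rhs(x + h * k1 / 2, t + h / 2)
        k3 = pf_ode_rhs(x + h * k2 / 2, t + h / 2)
        k4 = pf_ode_rhs(x + h * k3, t + h)
        x = x + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return x

draws = sample(20000)
print(draws.mean(), draws.std())  # should approach mu0 = 2.0, sig0 = 0.5
```

Because the flow is deterministic, only 40 coarse RK4 steps recover the data statistics here; a stochastic sampler would need far finer discretization for comparable accuracy.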
FOLDS seminar: Transformers Meet In-Context Learning: A Universal Approximation Theory
Zoom link: https://upenn.zoom.us/j/98220304722 Large language models are capable of in-context learning, the ability to perform new tasks at test time using a handful of input-output examples, without parameter updates. […]
ASSET Seminar: “When do spectral gradient updates help in deep learning?”
Spectral gradient methods, such as the recently popularized Muon algorithm, are a promising alternative to standard Euclidean gradient descent for training deep neural networks and transformers, but it is still […]
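A minimal sketch of the kind of update the talk concerns, under illustrative assumptions: the gradient matrix is orthogonalized via an exact SVD (practical Muon-style implementations instead approximate this with Newton-Schulz iterations), and the objective is a toy quadratic, not the speaker's setting.

```python
import numpy as np

def spectral_step(W, G, lr=0.05):
    # Replace the gradient matrix G by U @ Vt from its reduced SVD,
    # so every singular direction of the update has equal magnitude lr.
    U, _, Vt = np.linalg.svd(G, full_matrices=False)
    return W - lr * (U @ Vt)

# Toy objective f(W) = 0.5 * ||W - W_star||_F^2, with gradient W - W_star.
rng = np.random.default_rng(0)
W_star = rng.standard_normal((4, 3))
W = np.zeros((4, 3))
for _ in range(200):
    W = spectral_step(W, W - W_star)

# Each singular value of the error shrinks by lr per step, then hovers
# below lr: spectral steps have constant norm, so the iterate lands
# within ~lr of the optimum rather than converging exactly.
print(np.linalg.norm(W - W_star))
```

The contrast with Euclidean gradient descent is visible even in this toy: the update direction ignores the gradient's singular-value spectrum, which is precisely the behavior whose benefits and limits the talk examines.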
FOLDS seminar: Provably Efficient Learning in Nonlinear Dynamical Systems via Spectral Transformers
Zoom link: https://upenn.zoom.us/j/98220304722 Learning in dynamical systems is a fundamental challenge underlying modern sequence modeling. Despite extensive study, efficient algorithms with formal guarantees for general nonlinear systems have remained elusive. This […]