ASSET Seminar: “When do spectral gradient updates help in deep learning?”
Spectral gradient methods, such as the recently popularized Muon algorithm, are a promising alternative to standard Euclidean gradient descent for training deep neural networks and transformers, but it is still […]
FOLDS seminar: Provably Efficient Learning in Nonlinear Dynamical Systems via Spectral Transformers
Zoom link: https://upenn.zoom.us/j/98220304722
Learning in dynamical systems is a fundamental challenge underlying modern sequence modeling. Despite extensive study, efficient algorithms with formal guarantees for general nonlinear systems have remained elusive. This […]
ASSET Seminar: “Better Algorithms for Better Neighbors”
Nearest neighbor search has a long history in theoretical computer science, and in the past decade it has seen an explosion in usage. This has been driven primarily by embedding models […]
ASSET Seminar: “Formal Methods for Language Model Systems”
Formal methods are often dismissed as too rigid, complex, or unscalable for frontier language model systems (e.g., LLMs, VLMs, agentic systems). In this talk, I will challenge this assumption with […]
ASSET Seminar: “How can we enable LLM auditing?”
Oversight and auditing of AI systems is becoming increasingly difficult as people use systems in a wide variety of ways, with instructions expressed in natural language prompts. We can no […]
ASSET Seminar: “Towards discrete diffusion models for language and image generation”
We discuss discrete diffusion models that offer a unified framework for jointly modeling categorical data such as text and images. We present a new model that we have developed for […]