
ASSET Seminar: “Some Displaced Vignettes on Generalized Notions of Equivariance”

October 16 at 12:00 PM - 1:15 PM

Abstract:

The explicit incorporation of task-specific inductive biases through symmetry has emerged as a crucial design precept in the development of high-performance machine learning models. Symmetry-aware neural networks, such as group equivariant networks, have achieved notable success in areas like protein and drug design, where capturing task-specific symmetries improves generalization. Recent efforts have focused on models that relax equivariance, balancing flexibility and equivariance to enhance performance. In the first part of the talk, I will discuss the benefits of partial and approximate equivariance from a theoretical perspective, presenting quantitative bounds that demonstrate how models capturing task-specific symmetries lead to improved generalization. Utilizing this quantification, I will examine the more general question of dealing with approximate/partial symmetries and model mis-specification, delineating conditions under which the model's equivariance is optimal for a given level of data symmetry. In the second part, I will present a general formalism based on special structured matrices, which generalizes the classical low-displacement-rank theory of Kailath and co-workers and can help in constructing approximately equivariant neural networks with significantly reduced parameter counts. In the last part, I will discuss some attempts at generalizing notions of equivariance in the context of language and compositional generalization. I will also talk about some ongoing work on using such notions for the problem of inverse protein folding.

Work done in collaboration with: Mircea Petrache (Pontificia Universidad Católica de Chile), Ashwin Samudre (Simon Fraser University), Brian D. Nord (Fermilab and University of Chicago), and Payel Das (IBM Research).
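To give a flavor of the structured-matrix thread mentioned in the abstract, below is a minimal numpy sketch of the classical displacement-rank idea of Kailath and co-workers that the talk's formalism generalizes: for a Toeplitz matrix T, the shift-based displacement D(T) = T − Z T Zᵀ is nonzero only in its first row and column, so it has rank at most 2 even though T is generically full rank. This is only the textbook starting point, not the generalized formalism or the equivariant constructions presented in the talk; the particular shift matrix Z and operator used here are standard choices assumed for illustration.

import numpy as np
from scipy.linalg import toeplitz

n = 6
# A Toeplitz matrix: entries are constant along each diagonal.
T = toeplitz(np.random.randn(n), np.random.randn(n))

# Lower shift matrix Z: ones on the first subdiagonal.
Z = np.eye(n, k=-1)

# Classical (Sylvester-type) displacement operator: D(T) = T - Z T Z^T.
# Since T[i, j] == T[i-1, j-1], everything cancels except the first row
# and first column of D, so rank(D) <= 2.
D = T - Z @ T @ Z.T
print(np.linalg.matrix_rank(D))  # at most 2
print(np.linalg.matrix_rank(T))  # generically n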

Zoom Link (if unable to attend in-person): https://upenn.zoom.us/j/96014696752

Shubhendu Trivedi

Researcher

Shubhendu Trivedi is a researcher working in machine learning and computational & applied mathematics, with a focus on symmetry-based learning, uncertainty quantification, reliable deployment, and machine learning applications in physics and chemistry. He is currently engaged in a simulation-based inference project for telescope automation at Fermilab. Previously, Shubhendu held roles as a research associate and research affiliate at MIT CSAIL for several years, where he was also part of the MIT Consortium on Machine Learning for Pharmaceutical Discovery and Synthesis (MLPDS). Between 2018 and 2019, he was also the Institute Fellow in Mathematics at Brown University’s ICERM, in the programs on non-linear algebra and algebraic vision. He earned his PhD in 2018 at the University of Chicago and the Toyota Technological Institute, where he worked on symmetry-based neural networks and metric estimation. He also holds a master’s degree specializing in computer vision, another master’s degree focused on educational analytics, a bachelor’s in electronics engineering, and a diploma in wireless network design. Shubhendu’s research experience is complemented by extensive client-facing applied data science experience and first-hand real-world deployment/engineering experience, having worked with organizations such as ZS Associates, United Airlines, and Kraft, among others. In addition to his academic and research work, he has co-founded a startup in semiconductors and serves on the boards of multiple startups, the most recent of which include Reexpress AI (LLMs), Brainwell Health (Imaging), and Spark Neuro (EEG).

Details

Date:
October 16
Time:
12:00 PM - 1:15 PM

Venue

Raisler Lounge (Room 225), Towne Building
220 South 33rd Street
Philadelphia, PA 19104 United States