
ESE Ph.D. Thesis Defense: “Neural Compression: Estimating and Achieving the Fundamental Limits”
April 18 at 12:00 PM - 2:00 PM
Neural compression, which refers to compression schemes learned from data using neural networks, has emerged as a powerful approach for compressing real-world data. Neural compressors often outperform classical schemes, especially in settings where reconstructions that are perceptually similar to the source are desired. Despite this empirical success, the fundamental principles governing how neural compressors operate, perform, and trade off performance with complexity are far less well understood than those of classical schemes.
We aim to develop some of the fundamental principles of neural compression. We first introduce neural estimation methods that estimate the theoretical rate-distortion limits of lossy compression for high-dimensional sources using techniques from generative modeling. These methods show that recent neural compressors are sub-optimal. Next, we build on these insights to design neural compressors that approach optimality yet remain low-complexity through the use of lattice coding techniques; these are shown to approach the rate-distortion limits on high-dimensional sources without a significant increase in complexity. Finally, we develop low-complexity compressors for the rate-distortion-perception setting, where an additional perception constraint ensures that the source and reconstruction distributions are close in terms of a statistical divergence. These compressors combine lattice coding with shared randomness via dithering over the lattice cells, and provably achieve the fundamental rate-distortion-perception limits for the Gaussian source.
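For context, the rate-distortion-perception limit referenced above is typically formulated as a constrained mutual-information minimization; the formulation below follows the standard literature and is not taken verbatim from the talk:

```latex
% Rate-distortion-perception function for a source X ~ p_X,
% distortion measure d, and a statistical divergence D_f:
R(D, P) \;=\; \min_{p_{\hat{X} \mid X}} \; I(X; \hat{X})
\quad \text{s.t.} \quad
\mathbb{E}\bigl[\, d(X, \hat{X}) \,\bigr] \le D,
\qquad
D_f\!\left( p_X \,\Vert\, p_{\hat{X}} \right) \le P .
```

Setting the perception budget P to infinity removes the distribution-matching constraint and recovers the classical rate-distortion function R(D).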
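As a rough illustration of "shared randomness via dithering over the lattice cells," here is a minimal sketch using the integer lattice as a stand-in for the structured lattices discussed in the talk; the function names and lattice choice are assumptions made for illustration, not the actual scheme from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

def dithered_quantize(x, u):
    """Encoder: snap the dithered source to the nearest lattice point.

    The lattice here is the integer lattice Z^n (a hypothetical stand-in);
    `u` is a dither shared by encoder and decoder, drawn uniformly over
    the fundamental cell [-1/2, 1/2)^n.
    """
    return np.round(x + u)

def dithered_reconstruct(q, u):
    """Decoder: subtract the shared dither from the received lattice point."""
    return q - u

n = 8
x = rng.standard_normal(n)           # source vector
u = rng.uniform(-0.5, 0.5, size=n)   # shared dither (common randomness)

q = dithered_quantize(x, u)
x_hat = dithered_reconstruct(q, u)

# Key property of dithering: the error x_hat - x = round(x+u) - (x+u)
# is uniform over the lattice cell and independent of x, which is what
# makes dithered lattice schemes amenable to exact distortion and
# perception analysis.
print(np.max(np.abs(x_hat - x)))     # bounded by 1/2 per coordinate
```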

Eric Lei
ESE Ph.D. Candidate
Eric Lei is a final-year Ph.D. student at the University of Pennsylvania in the Department of Electrical and Systems Engineering, advised by Shirin Saeedi Bidokhti and Hamed Hassani, and supported by an NSF Graduate Research Fellowship. His research interests lie at the intersection of machine learning and information theory, particularly neural compression and generative models. Previously, he obtained his B.S. from Cornell University.