Transformer-Based Transform Coding with Auke Wiggers

EPISODE 570 | MAY 2, 2022

About this Episode

Today we’re joined by Auke Wiggers, an AI research scientist at Qualcomm. In our conversation with Auke, we discuss his team’s recent research on data compression using generative models. We explore the relationship between historical compression research and the current trend of neural compression, as well as the benefits of neural codecs, which learn to compress data from examples. We also cover how these models are evaluated and recent developments showing that they can run in real time on a mobile device. Finally, we discuss his team’s ICLR paper, “Transformer-Based Transform Coding,” which proposes a vision-transformer-based architecture for image and video coding, along with some of their other accepted works at the conference.

About the Guest

Connect with Auke

Thanks to our sponsor Qualcomm AI Research

Resources