Channel Gating for Cheaper and More Accurate Neural Nets with Babak Ehteshami Bejnordi

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Today we’re joined by Babak Ehteshami Bejnordi, a Research Scientist at Qualcomm.

Babak works closely with former guest Max Welling and is currently focused on conditional computation, which is the main driver for today's conversation. We dig into a few papers in great detail, including one from this year's CVPR conference, Conditional Channel Gated Networks for Task-Aware Continual Learning. We also discuss the paper TimeGate: Conditional Gating of Segments in Long-range Activities, and another paper from this year's ICLR conference, Batch-Shaping for Learning Conditional Channel Gated Networks. We cover how gates are used to improve efficiency and accuracy while decreasing effective model size, how this research makes its way into actual products, and more!
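For listeners who want a concrete picture of the idea, here is a minimal, hypothetical PyTorch-style sketch of conditional channel gating: a small gating head looks at the input and predicts per-channel on/off decisions that mask a convolution's output. This is only an illustration of the general concept discussed in the episode, not code from any of the papers above; all class, function, and parameter names are made up.

```python
# Illustrative sketch of conditional channel gating (hypothetical, not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelGate(nn.Module):
    """Predicts a per-channel gate from a summary of the input feature map."""

    def __init__(self, in_channels, gated_channels, hidden=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, gated_channels),
        )

    def forward(self, x):
        # Global average pooling summarizes the input, then a tiny MLP produces logits.
        logits = self.mlp(x.mean(dim=(2, 3)))  # shape: (batch, gated_channels)
        if self.training:
            # Soft, differentiable gates during training.
            return torch.sigmoid(logits)
        # Hard 0/1 gates at inference; a real implementation could skip the
        # computation of switched-off channels, which is where compute is saved.
        return (logits > 0).float()


class GatedConvBlock(nn.Module):
    """A conv block whose output channels are switched on or off per input."""

    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1)
        self.gate = ChannelGate(in_channels, out_channels)

    def forward(self, x):
        g = self.gate(x)                  # (batch, out_channels)
        y = F.relu(self.conv(x))          # (batch, out_channels, H, W)
        return y * g[:, :, None, None]    # zero out (mask) gated-off channels
```

In this toy version the masked channels are still computed and then zeroed; the efficiency gains discussed in the episode come from architectures and hardware support that avoid computing gated-off channels altogether, and from training techniques (such as the batch-shaping regularizer) that encourage useful gating patterns.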

Thanks to our sponsor

I’d like to send a huge thank you to our friends at Qualcomm for their support of the podcast, and their sponsorship of this series! Qualcomm AI Research is dedicated to advancing AI to make its core capabilities — perception, reasoning, and action — ubiquitous across devices. Their work makes it possible for billions of users around the world to have AI-enhanced experiences on Qualcomm Technologies-powered devices. To learn more about what Qualcomm is up to on the research front, visit here.

Connect with Babak!

Resources

Join Forces!

“More On That Later” by Lee Rosevere licensed under CC By 4.0
