Channel Gating for Cheaper and More Accurate Neural Nets with Babak Ehteshami Bejnordi
EPISODE 385 | JUNE 22, 2020
About this Episode
Today we're joined by Babak Ehteshami Bejnordi, a Research Scientist at Qualcomm.
Babak works closely with former guest Max Welling and is currently focused on conditional computation, which is the main driver for today's conversation. We dig into a few papers in detail, including one from this year's CVPR conference, Conditional Channel Gated Networks for Task-Aware Continual Learning. We also discuss TimeGate: Conditional Gating of Segments in Long-range Activities, as well as a paper from this year's ICLR conference, Batch-Shaping for Learning Conditional Channel Gated Networks. We cover how gates are used to improve efficiency and accuracy while decreasing model size, how this research makes its way into actual products, and more!
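
For readers who want a concrete feel for the gating idea discussed in the episode, below is a minimal sketch of a conditional channel gate in PyTorch. It is an illustrative assumption, not the authors' implementation: the class names (ChannelGate, GatedConvBlock) and the Gumbel-sigmoid relaxation are stand-ins for the general idea of deciding, per input, which convolutional channels to execute.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelGate(nn.Module):
    """Illustrative per-channel gate: a tiny MLP on globally pooled
    features predicts which output channels of a conv block to run."""

    def __init__(self, in_channels, out_channels, hidden=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, out_channels),
        )

    def forward(self, x, temperature=1.0):
        # Global average pooling summarizes the input per channel.
        summary = x.mean(dim=(2, 3))           # (N, C_in)
        logits = self.mlp(summary)             # (N, C_out)
        if self.training:
            # Gumbel-sigmoid (binary Concrete) relaxation keeps the
            # gate differentiable during training.
            u = torch.rand_like(logits)
            noise = torch.log(u) - torch.log1p(-u)
            gate = torch.sigmoid((logits + noise) / temperature)
        else:
            # Hard 0/1 decision at inference: channels gated off can be
            # skipped entirely, which is where the compute savings come from.
            gate = (logits > 0).float()
        return gate.unsqueeze(-1).unsqueeze(-1)  # (N, C_out, 1, 1)


class GatedConvBlock(nn.Module):
    """Conv block whose output channels are switched on or off per input."""

    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, 3, padding=1)
        self.gate = ChannelGate(in_channels, out_channels)

    def forward(self, x):
        g = self.gate(x)
        return F.relu(self.conv(x)) * g


# Example: different inputs activate different subsets of channels.
block = GatedConvBlock(3, 32)
out = block(torch.randn(4, 3, 56, 56))
print(out.shape)  # torch.Size([4, 32, 56, 56])
```

In the papers discussed, an additional objective (such as batch-shaping or a sparsity penalty) encourages the gates to fire selectively, so the network learns to spend compute only where an input needs it.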
About the Guest
Babak Ehteshami Bejnordi
Qualcomm
Resources
- Paper: Batch-Shaping for Learning Conditional Channel Gated Networks
- Paper: Conditional Channel Gated Networks for Task-Aware Continual Learning
- Paper: TimeGate: Conditional Gating of Segments in Long-range Activities
- Camelyon Challenge
- Paper: Adaptive Mixtures of Local Experts
- Enabling Continual Learning in Neural Networks
