Today we’re joined by friend of the show Jeff Gehlhaar, VP of technology and the head of AI software platforms at Qualcomm.
Subscribe: iTunes / Google Play / Spotify / RSS
In our conversation with Jeff, we cover a ton of ground, starting with an exploration of ML compilers: what they are and their role in solving issues of parallelism. We also dig into the latest additions to the Snapdragon platform, including AI Engine Direct and how it works as a bridge to bring more capabilities across the platform, how benchmarking works in the context of the platform, how the work of other researchers we’ve spoken to on compression and quantization finds its way from research to product, and much more!
After you check out this interview, you can look below for some of the other conversations with researchers mentioned.
Thanks to our Sponsor!
I’d like to send a huge thank you to our friends at Qualcomm Technologies for their continued support of the podcast, and their sponsorship of this series of podcasts from the CVPR conference! Qualcomm AI Research is dedicated to advancing AI to make its core capabilities — perception, reasoning, and action — ubiquitous across devices. Their work makes it possible for billions of users around the world to have AI-enhanced experiences on devices powered by Qualcomm Technologies. To learn more about what Qualcomm Technologies is up to on the research front, visit twimlai.com/qualcomm.
Connect with Jeff!
- Sep ‘20 – Open Source at Qualcomm AI Research w/ Jeff G. and Zahra K. – #414
- Dec ‘20 – Natural Graph Networks with Taco Cohen – #440
- May ‘21 – Probabilistic Numeric CNNs with Roberto Bondesan – #482
- June ‘21 – Accelerating Distributed AI Applications at Qualcomm w/Ziad Asghar – #489
- June ‘21 – Skip-Convolutions for Efficient Video Processing with Amir Habibian – #496
- Benchmarking Machine Learning with MLCommons w/ Peter Mattson – #434