Expectation Maximization, Gaussian Mixtures & Belief Propagation, OH MY! with Inmar Givoni

This Week in Machine Learning & AI

In this episode I’m joined by Inmar Givoni, Autonomy Engineering Manager at Uber ATG, to discuss her work on the paper Min-Max Propagation, which was presented at NIPS last month in Long Beach.

Inmar and I get into a meaty discussion about graphical models, including what they are and how they’re used, some of the challenges they present for both training and inference, and how and where they can be best applied. Then we jump into an in-depth look at the key ideas behind the Min-Max Propagation paper itself, including the relationship to the broader domain of belief propagation and ideas like affinity propagation, and how all these can be applied to a use case example like the makespan problem. This was a really fun conversation! Enjoy!
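For readers unfamiliar with the makespan problem mentioned above: given a set of jobs with known durations and a set of machines, the goal is to assign jobs to machines so that the maximum total load on any machine (the makespan) is minimized. This min-over-max objective is exactly the kind of problem min-max propagation targets. The sketch below is a hypothetical illustration (brute-force search, not the paper's message-passing algorithm); the function names and example numbers are my own.

```python
# Hypothetical illustration of the makespan objective (not the paper's
# min-max propagation algorithm): brute-force search over assignments.
from itertools import product

def makespan(assignment, durations, n_machines):
    """Maximum total processing time over machines for one assignment."""
    loads = [0] * n_machines
    for job, machine in enumerate(assignment):
        loads[machine] += durations[job]
    return max(loads)

def brute_force_min_makespan(durations, n_machines):
    """Exhaustively find the assignment that minimizes the makespan."""
    best, best_span = None, float("inf")
    for assignment in product(range(n_machines), repeat=len(durations)):
        span = makespan(assignment, durations, n_machines)
        if span < best_span:
            best_span, best = span, assignment
    return best, best_span

# Example: four jobs with durations [3, 3, 2, 2] on 2 machines can be
# split as {3, 2} and {3, 2}, giving a makespan of 5.
```

Brute force is exponential in the number of jobs; the appeal of approaches like min-max propagation is performing this kind of minimization via local message passing on a factor graph instead.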

Be sure to check out some of the great names that will be at the AI Conference in New York, Apr 29–May 2, where you’ll join leading minds in AI including Peter Norvig, George Church, Olga Russakovsky, Manuela Veloso, and Zoubin Ghahramani. Explore AI’s latest developments, separate hype from what’s really game-changing, and learn how to apply AI in your organization right now. Save 20% on most passes with discount code PCTWIML. Early price ends February 2!

About Inmar

Mentioned in the Interview
