In this episode of our Deep Learning Indaba series, we’re joined by Naila Murray, Senior Research Scientist and Group Lead in the computer vision group at Naver Labs Europe.
Subscribe: iTunes / Google Play / Spotify / RSS
Naila presented at the Indaba on computer vision. In this discussion, we explore her work on visual attention, including why visual attention is important and the trajectory of work in the field over time. We also discuss her paper “Generalized Max Pooling,” and her recent research interest in learning representations with deep learning.
Thanks to our Sponsor!
I’d like to send a big shout-out to our friends at Google AI for their support of the podcast and their sponsorship of this series. In this podcast you heard Sara talk about the AI Residency program she’s in at Google. Well, just yesterday they opened up applications for the 2019 program! The Google AI Residency is a one-year machine learning research training program with the goal of helping individuals become successful machine learning researchers. The program seeks Residents from a very diverse set of educational and professional backgrounds from all over the world, so if you think this is something that interests you, you should definitely apply! Find out more about the program at g.co/airesidency.
Mentioned in the Interview
- Deep Learning Indaba
- Paper: Color image perception
- Paper: Generalized Max Pooling
- Paper: A Feature-Integration Theory of Attention – A. Treisman
- Deep Learning Indaba Series Page
- TWIML Presents: Series page
- TWIML Events Page
- TWIML Meetup
- TWIML Newsletter
“More On That Later” by Lee Rosevere licensed under CC By 4.0