In this episode I'm joined by Amir Zamir, a postdoctoral researcher at both Stanford and UC Berkeley.
Amir joins us fresh off winning the 2018 CVPR Best Paper Award for co-authoring "Taskonomy: Disentangling Task Transfer Learning." In this work, Amir and his coauthors explore the relationships between different visual tasks and use that structure to better understand which kinds of transfer learning will be most effective for each, resulting in what they call a "computational taxonomic map for task transfer learning."
In our conversation, we discuss the nature and consequences of the relationships that Amir and his team discovered, and how they can be used to build more effective visual systems with machine learning. Along the way, Amir provides a ton of great examples and explains the various tools his team has created to illustrate these concepts.