Today we’re joined by Katherine J. Kuchenbecker, director at the Max Planck Institute for Intelligent Systems and head of the Haptic Intelligence Department.
Subscribe: iTunes / Google Play / Spotify / RSS
In our conversation, we explore Katherine’s research interests, which lie at the intersection of haptics (our physical interaction with the world) and machine learning, and she introduces us to the concept of “haptic intelligence.” We discuss how ML, mainly computer vision, has been integrated into robotic systems, and some of the devices Katherine’s lab is developing to take advantage of this research.
We also talk about hugging robots, augmented reality in robotic surgery, and the degree to which her work involves human-robot interaction. Finally, Katherine shares her passion for mentoring and the importance of diversity and inclusion in robotics and machine learning.
Connect with Katherine!
- Embodied Multimodal Learning Workshop | ICLR 2021
- That’s a VIBE: ML for Human Pose and Shape Estimation with Nikos Athanasiou, Muhammed Kocabas, Michael Black – #409
- Trust in Human-Robot/AI Interactions with Dr. Ayanna Howard – #110
- Human-Robot Interaction and Empathy with Kate Darling – #289
- HuggieBot 1.0: Alexis Block is teaching robots to give good hugs
- Paper: The Six Hug Commandments: Design and Evaluation of a Human-Sized Hugging Robot with Visual and Haptic Perception
- Paper: Robot Interaction Studio: A Platform for Unsupervised HRI
- Paper: Sensor Arrangement for sensing forces and methods for fabricating a sensor arrangement and parts thereof
- Paper: Method for force inference, method for training a feed-forward neural network, force inference module, and sensor arrangement
- Paper: Learning Haptic Adjectives from Tactile Data