Learning Long-Time Dependencies with RNNs with Thorben Konstantin Rusch
EPISODE 484 | MAY 17, 2021
About this Episode
Today we conclude our 2021 ICLR coverage, joined by Konstantin Rusch, a PhD student at ETH Zurich.
In our conversation with Konstantin, we explore his recent papers, coRNN and UnICORNN, which propose novel recurrent neural network architectures for learning long-time dependencies. We discuss the inspiration he drew from neuroscience in tackling this problem, how the architectures' performance compares to that of LSTMs and other networks known to handle long-time dependencies, and Konstantin's future research goals.
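For readers curious about the core idea behind the coupled-oscillator approach discussed in the episode, here is a minimal, hedged sketch of a coRNN-style recurrent cell: a hidden state modeled as a network of coupled, damped oscillators and integrated with a simple explicit time step. The class name, parameter names, default values, and the exact discretization below are illustrative assumptions, not the authors' reference implementation; see the coRNN and UnICORNN papers for the precise formulations.

```python
import torch
import torch.nn as nn


class CoupledOscillatorCell(nn.Module):
    """Sketch of a coRNN-style cell: hidden units behave like coupled,
    damped oscillators. Positions y and velocities z are advanced with a
    simple explicit step of size dt (illustrative only)."""

    def __init__(self, input_size, hidden_size, dt=0.01, gamma=1.0, epsilon=1.0):
        super().__init__()
        self.dt, self.gamma, self.epsilon = dt, gamma, epsilon
        self.W_y = nn.Linear(hidden_size, hidden_size, bias=False)  # coupling on positions y
        self.W_z = nn.Linear(hidden_size, hidden_size, bias=False)  # coupling on velocities z
        self.V = nn.Linear(input_size, hidden_size)                 # drive from the input u

    def forward(self, u, state):
        y, z = state
        # Second-order dynamics y'' = tanh(W_y y + W_z z + V u) - gamma*y - epsilon*z,
        # discretized explicitly: update velocity first, then position.
        z = z + self.dt * (
            torch.tanh(self.W_y(y) + self.W_z(z) + self.V(u))
            - self.gamma * y
            - self.epsilon * z
        )
        y = y + self.dt * z
        return y, z


# Usage: unroll the cell over a sequence of shape (time, batch, input_size).
cell = CoupledOscillatorCell(input_size=8, hidden_size=32)
y = torch.zeros(4, 32)
z = torch.zeros(4, 32)
for u_t in torch.randn(20, 4, 8):
    y, z = cell(u_t, (y, z))
```

The oscillatory, bounded dynamics are what give this family of models its stable gradients over long sequences, which is the property the episode contrasts with LSTMs and other standard recurrent networks.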
About the Guest
Thorben Konstantin Rusch
ETH Zürich
Thanks to our sponsor Qualcomm AI Research
Qualcomm AI Research is dedicated to advancing AI to make its core capabilities — perception, reasoning, and action — ubiquitous across devices. Their work makes it possible for billions of users around the world to have AI-enhanced experiences on devices powered by Qualcomm Technologies. To learn more about what Qualcomm Technologies is up to on the research front, visit twimlai.com/qualcomm.