Today we conclude our 2021 ICLR coverage, joined by Konstantin Rusch, a PhD student at ETH Zurich.
Subscribe: iTunes / Google Play / Spotify / RSS
In our conversation with Konstantin, we explore his recent papers on coRNN and UnICORNN, which propose novel recurrent neural network architectures for learning long-time dependencies. We explore the inspiration he drew from neuroscience when tackling this problem, how the models' performance compares to that of established networks like LSTMs, and Konstantin's future research goals.
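For listeners curious what "coupled oscillatory" means in practice, here is a minimal sketch of the idea behind coRNN: the hidden state is modeled as a network of damped, driven oscillators via a second-order ODE, stepped forward in time. This is an illustrative explicit-Euler version with made-up parameter names, not the paper's exact formulation (which uses a more careful implicit-explicit discretization), so consult the paper linked below for the actual scheme.

```python
import numpy as np

def cornn_step(y, z, u, W, Wz, V, b, dt=0.01, gamma=1.0, eps=0.01):
    """One explicit-Euler step of a coupled-oscillator RNN cell (illustrative).

    The hidden state y is treated as the position of a set of damped,
    driven oscillators, with z = y' as its velocity:
        y'' = tanh(W @ y + Wz @ y' + V @ u + b) - gamma * y - eps * y'
    Here gamma sets the restoring force and eps the damping; all names
    are illustrative assumptions, not the paper's notation.
    """
    z_new = z + dt * (np.tanh(W @ y + Wz @ z + V @ u + b) - gamma * y - eps * z)
    y_new = y + dt * z_new
    return y_new, z_new

# Process a sequence by applying the step to each input u_t in turn.
rng = np.random.default_rng(0)
d_h, d_in, T = 8, 4, 20  # hidden size, input size, sequence length
W, Wz = rng.normal(size=(d_h, d_h)), rng.normal(size=(d_h, d_h))
V, b = rng.normal(size=(d_h, d_in)), np.zeros(d_h)
y, z = np.zeros(d_h), np.zeros(d_h)
for u in rng.normal(size=(T, d_in)):
    y, z = cornn_step(y, z, u, W, Wz, V, b)
```

The appeal of the oscillator view, as discussed in the episode, is that the bounded dynamics help keep gradients from exploding or vanishing over very long sequences.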
Thanks to our Sponsor!
I’d like to send a huge thank you to our friends at Qualcomm Technologies for their continued support of the podcast, and their sponsorship of this ICLR series! Qualcomm AI Research is dedicated to advancing AI to make its core capabilities — perception, reasoning, and action — ubiquitous across devices. Their work makes it possible for billions of users around the world to have AI-enhanced experiences on devices powered by Qualcomm Technologies. To learn more about what Qualcomm Technologies is up to on the research front, visit twimlai.com/qualcomm.
Connect with Konstantin!
- Paper: Coupled Oscillatory Recurrent Neural Network (coRNN): An accurate and (gradient) stable architecture for learning long time dependencies
- Paper: UnICORNN: A recurrent model for learning very long time dependencies
- Paper: Long Short-Term Memory