Learning Long-Time Dependencies with RNNs w/ Konstantin Rusch

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Today we conclude our 2021 ICLR coverage joined by Konstantin Rusch, a PhD student at ETH Zurich.


In our conversation with Konstantin, we explore his recent papers on coRNN and UnICORNN, which introduce novel recurrent neural network architectures for learning long-time dependencies. We discuss the inspiration he drew from neuroscience when tackling this problem, how the performance of these models compares to that of LSTMs and other networks known to handle long-time dependencies, and Konstantin's future research goals.
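
For readers curious what an oscillator-inspired recurrence looks like, below is a minimal sketch of a coRNN-style cell in PyTorch, based on the discretized coupled-oscillator equations described in the paper. The class name, layer names, and default hyperparameter values here are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

class CoRNNCell(nn.Module):
    """Sketch of a coupled-oscillator RNN cell (coRNN-style).

    The hidden state is a pair (y, z): y plays the role of the oscillator
    positions and z of their velocities. gamma controls the oscillation
    frequency, epsilon the damping, and dt is the discretization step.
    All names and default values are illustrative assumptions.
    """

    def __init__(self, input_size, hidden_size, dt=0.01, gamma=1.0, epsilon=1.0):
        super().__init__()
        self.dt, self.gamma, self.epsilon = dt, gamma, epsilon
        self.Wy = nn.Linear(hidden_size, hidden_size, bias=False)  # couples positions
        self.Wz = nn.Linear(hidden_size, hidden_size, bias=False)  # couples velocities
        self.V = nn.Linear(input_size, hidden_size)                # drives with the input

    def forward(self, u, y, z):
        # Discretization of a driven, damped system of coupled oscillators:
        # the damping term (epsilon * z) is treated implicitly for stability,
        # the nonlinear forcing and the restoring term (gamma * y) explicitly.
        force = torch.tanh(self.Wy(y) + self.Wz(z) + self.V(u))
        z = (z + self.dt * (force - self.gamma * y)) / (1.0 + self.dt * self.epsilon)
        y = y + self.dt * z
        return y, z
```

Unrolling such a cell over a sequence, with dt and epsilon acting as stability knobs, gives the basic recurrence; UnICORNN, as its name suggests, takes a related route using independent, undamped oscillators stacked into deeper networks.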

Thanks to our Sponsor!

I’d like to send a huge thank you to our friends at Qualcomm Technologies for their continued support of the podcast, and their sponsorship of this ICLR series! Qualcomm AI Research is dedicated to advancing AI to make its core capabilities — perception, reasoning, and action — ubiquitous across devices. Their work makes it possible for billions of users around the world to have AI-enhanced experiences on devices powered by Qualcomm Technologies. To learn more about what Qualcomm Technologies is up to on the research front, visit twimlai.com/qualcomm.

Connect with Konstantin!

Resources
