Learning Long-Term Dependencies with Gradient Descent is Difficult – TWiML Online Meetup #2 – September 2017


This is a recording of the TWiML Online Meetup. This month we discuss the paper “Learning Long-Term Dependencies with Gradient Descent is Difficult” by Yoshua Bengio et al., one of the classic papers on recurrent neural networks. A huge thank-you to listener Nikola Kučerová for presenting. Make sure you like this video and subscribe to our channel!

Learning Long-Term Dependencies with Gradient Descent is Difficult, by Yoshua Bengio, Patrice Simard, and Paolo Frasconi (IEEE Transactions on Neural Networks, 1994)

To register for the next meetup, visit twimlai.com/meetup
