Learning Long-Term Dependencies with Gradient Descent is Difficult – TWiML Online Meetup #2 – September 2017
This is a recording of the TWiML Online Meetup group. This month we discuss the paper “Learning Long-Term Dependencies with Gradient Descent is Difficult” by Yoshua Bengio and colleagues, one of the classic papers on recurrent neural networks. A huge thank you to listener Nikola Kučerová for presenting. If you enjoy the video, please like it and subscribe to our channel!
To register for the next meetup, visit twimlai.com/meetup