Learning Long-Term Dependencies with Gradient Descent is Difficult - TWIML Online Meetup #2 - September 2017
This is a recording of the TWIML Online Meetup group. This month we discuss the paper “Learning Long-Term Dependencies with Gradient Descent is Difficult” by Yoshua Bengio et al., one of the classic papers on Recurrent Neural Networks. A huge thank you to listener Nikola Kučerová for presenting. Make sure you Like this video and subscribe to our channel!

https://youtu.be/wLXXfBRxUTE

Learning Long-Term Dependencies with Gradient Descent is Difficult, by Yoshua Bengio et al.
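As a quick taste of the paper's core result (this sketch is not from the video): when gradients are backpropagated through a recurrence whose Jacobian has spectral norm below 1, their magnitude shrinks exponentially with the number of time steps, which is why long-term dependencies are hard to learn by gradient descent. A minimal NumPy illustration, using a linear recurrence and an arbitrarily chosen spectral norm of 0.9:

```python
import numpy as np

# Illustrative sketch of the vanishing-gradient effect the paper analyzes.
# Linear recurrence h_t = W h_{t-1}; backpropagated gradients pick up one
# factor of W^T per step, so their norm decays like (spectral norm of W)^t
# when that norm is below 1.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))
W *= 0.9 / np.linalg.norm(W, 2)  # rescale so the largest singular value is 0.9

grad = np.ones(8)  # gradient arriving at the final time step
norms = []
for t in range(50):
    grad = W.T @ grad  # one step of backpropagation through time
    norms.append(np.linalg.norm(grad))

print(norms[0], norms[-1])  # the gradient norm has collapsed after 50 steps
```

With 50 steps the norm is bounded by 0.9**50 (about 0.005) times its starting value; with a spectral norm above 1 the same loop exhibits the opposite failure mode, exploding gradients.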

To register for the next meetup, visit twimlai.com/meetup