This is a recording of the TWIML Online Meetup group. This month we discuss the paper “Learning Long-Term Dependencies with Gradient Descent is Difficult” by Yoshua Bengio and colleagues, one of the classic papers on recurrent neural networks. A huge thank you to listener Nikola Kučerová for presenting. Make sure to like this video and subscribe to our channel!

Learning Long-Term Dependencies with Gradient Descent is Difficult, by Yoshua Bengio et al.

To register for the next meetup, visit twimlai.com/meetup