Today we’re joined by Jos van der Westhuizen, a PhD student in Engineering at Cambridge University.
Subscribe: iTunes / Google Play / Spotify / RSS
Jos’ research focuses on applying LSTMs, or Long Short-Term Memory neural networks, to biological data for various tasks. In our conversation, we discuss his paper “The unreasonable effectiveness of the forget gate,” in which he explores the various “gates” that make up an LSTM module and how removing gates affects the computational cost of training these networks. Jos ultimately finds that keeping only the forget gate results in an unreasonably effective network, and we discuss why. Jos also shares some great LSTM-related resources, including references to Jürgen Schmidhuber, whose research group invented the LSTM, and whom I spoke to back in Talk #44.
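To give a flavor of the idea, here is a minimal NumPy sketch of a forget-gate-only recurrent cell in the spirit of the paper. This is an illustrative simplification, not the paper’s exact implementation: a standard LSTM has input, forget, and output gates, while here a single forget gate `f` controls the update, with `(1 - f)` coupling in the candidate state in place of a separate input gate. All weight names and dimensions below are made up for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forget_only_step(x_t, h_prev, params):
    """One step of a forget-gate-only recurrent cell.

    The forget gate f interpolates between the previous hidden state
    and a tanh candidate, so one gate does the work of two.
    """
    Wf, Uf, bf, Wc, Uc, bc = params
    f = sigmoid(Wf @ x_t + Uf @ h_prev + bf)        # forget gate in (0, 1)
    c_tilde = np.tanh(Wc @ x_t + Uc @ h_prev + bc)  # candidate state
    return f * h_prev + (1.0 - f) * c_tilde         # convex combination

# Toy usage with random weights and tiny dimensions.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
params = (
    rng.standard_normal((n_hid, n_in)),
    rng.standard_normal((n_hid, n_hid)),
    np.ones(n_hid),  # positive forget bias encourages remembering early in training
    rng.standard_normal((n_hid, n_in)),
    rng.standard_normal((n_hid, n_hid)),
    np.zeros(n_hid),
)
h = np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):  # a length-5 input sequence
    h = forget_only_step(x, h, params)
print(h.shape)  # (4,)
```

Because the update is a convex combination of the previous state and a bounded candidate, the hidden state stays in a well-behaved range, and the cell has roughly a third fewer parameters than a full LSTM of the same width.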
Join me at PegaWorld!
We’d like to join Pegasystems, who is sponsoring today’s episode, in inviting you to meet us at the MGM Grand in Las Vegas, June 2nd–5th, at PegaWorld, the company’s annual digital transformation conference.
Pegasystems puts AI at the center of its customer engagement software so that it optimizes every customer touchpoint on every channel in real time. That way each interaction is relevant and timely for each individual customer, whether it’s a sales call, a digital marketing campaign, or a customer service chat, either online or in-store. And the system is always learning in real time to make the next interaction better.
Pega’s customers are the real stars at PegaWorld. There you’ll hear great stories of AI applied to the customer experience from real Pega customers. The event is a great way to learn from a who’s who of the Fortune 500, and of course, I’ll be there and speaking as well.
To register, visit pegaworld.com and use the promo code TWIML19 when you sign up for $200 off. Again, that’s TWIML19. It’s as easy as that. Hope to see you there!
The O’Reilly Artificial Intelligence Conference is returning to New York in April and we have one FREE conference pass for a lucky listener! To enter:
- First: Go to twimlai.com/ainygiveaway to access the entry form.
- Next: Choose any or all of the nine ways to enter. The more entries you earn, the higher your chances to win!
- For 3 bonus entries, answer the question at the bottom of the entry box!
Mentioned in the Interview
- Paper: The unreasonable effectiveness of the forget gate
- Blog: The unreasonable effectiveness of recurrent neural networks
- The unreasonable effectiveness of mathematics in the natural sciences
- Paper: Long Short-Term Memory
- Paper: Learning to forget: continual prediction with LSTM
- Paper: LSTM: A Search Space Odyssey
- Paper: An Empirical Exploration of Recurrent Network Architectures
- Paper: Can recurrent neural networks warp time?
- Paper: WaveNet: A Generative Model for Raw Audio
- Check out all of our great series from 2018 at the TWIML Presents: Series page!
- TWIML Online Meetup
- Register for the TWIML Newsletter
“More On That Later” by Lee Rosevere, licensed under CC BY 4.0