Designing Better Sequence Models with RNNs with Adji Bousso Dieng

EPISODE 160

About this Episode

In this episode, I'm joined by Adji Bousso Dieng, a PhD student in the Department of Statistics at Columbia University.

In this interview, Adji and I discuss two of her recent papers. The first, accepted at this year's ICML conference and titled "Noisin: Unbiased Regularization for Recurrent Neural Networks," presents, as the name implies, a new way to regularize RNNs by injecting noise into their hidden units. The second, an ICLR submission from last year titled "TopicRNN: A Recurrent Neural Network with Long-Range Semantic Dependency," introduces an RNN-based language model designed to capture the global semantic meaning relating the words in a document via latent topics. We dive into the details behind both papers, and I learn a ton along the way.
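
To give a flavor of the noise-injection idea behind Noisin, here is a minimal sketch in Python with PyTorch of regularizing an RNN by adding zero-mean noise to its hidden state at each time step during training. This is an illustrative sketch of the general scheme discussed in the episode, not the paper's implementation; the class name NoisyRNNCell and the noise scale sigma are assumptions for illustration.

# Minimal illustrative sketch (not the paper's implementation) of
# regularizing an RNN by injecting zero-mean Gaussian noise into its
# hidden state at every training step. NoisyRNNCell and sigma are
# hypothetical names chosen for this example.
import torch
import torch.nn as nn


class NoisyRNNCell(nn.Module):
    def __init__(self, input_size, hidden_size, sigma=0.1):
        super().__init__()
        self.cell = nn.RNNCell(input_size, hidden_size)
        self.sigma = sigma  # standard deviation of the injected noise

    def forward(self, x, h):
        h = self.cell(x, h)
        if self.training:
            # Additive zero-mean noise leaves the hidden state unchanged
            # in expectation while acting as a regularizer.
            h = h + self.sigma * torch.randn_like(h)
        return h


# Usage: run a sequence of shape (seq_len, batch, input_size) through the cell.
seq_len, batch, input_size, hidden_size = 20, 4, 8, 16
cell = NoisyRNNCell(input_size, hidden_size)
xs = torch.randn(seq_len, batch, input_size)
h = torch.zeros(batch, hidden_size)
for x in xs:
    h = cell(x, h)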

Connect with Adji Bousso

