Designing Better Sequence Models with RNNs with Adji Bousso Dieng
EPISODE 160 | JULY 2, 2018
About this Episode
In this episode, I'm joined by Adji Bousso Dieng, a PhD student in the Department of Statistics at Columbia University.
In this interview, Adji and I discuss two of her recent papers. The first, accepted at this year's ICML conference and titled "Noisin: Unbiased Regularization for Recurrent Neural Networks," presents, as the name implies, a new way to regularize RNNs via noise injection. The second, "TopicRNN: A Recurrent Neural Network with Long-Range Semantic Dependency," an ICLR submission from last year, introduces an RNN-based language model designed to capture the global semantic meaning relating words in a document via latent topics. We dive into the details behind both papers, and I learn a ton along the way.
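To make the noise-injection idea concrete, here is a minimal sketch of a recurrent cell that perturbs its hidden state with mean-one multiplicative noise during training, so the hidden state is unchanged in expectation. This is an illustration under assumptions, not the exact formulation from the Noisin paper; the `NoisyRNNCell` class name, the Gaussian noise choice, and the `noise_std` parameter are all hypothetical.

```python
import torch
import torch.nn as nn


class NoisyRNNCell(nn.Module):
    """Vanilla RNN cell with noise injected into the hidden state at training time.

    Illustrative sketch only: the noise distribution and injection point are
    simplifications. Using multiplicative noise with mean 1 keeps the hidden
    state unbiased in expectation, which is the spirit of "unbiased" noise
    injection discussed in the episode.
    """

    def __init__(self, input_size: int, hidden_size: int, noise_std: float = 0.1):
        super().__init__()
        self.cell = nn.RNNCell(input_size, hidden_size)
        self.noise_std = noise_std

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        h = self.cell(x, h)
        if self.training:
            # Mean-one multiplicative noise: E[noise * h] = h, so the
            # regularization does not bias the hidden state.
            noise = 1.0 + self.noise_std * torch.randn_like(h)
            h = h * noise
        return h


# Usage: step the noisy cell over a toy sequence.
cell = NoisyRNNCell(input_size=8, hidden_size=16)
x = torch.randn(4, 10, 8)          # (batch, time, features)
h = torch.zeros(4, 16)
for t in range(x.size(1)):
    h = cell(x[:, t], h)
```

At evaluation time (`cell.eval()`), the noise is disabled and the cell behaves like a standard RNN cell, which mirrors how noise-based regularizers are typically applied only during training.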
About the Guest
Adji Bousso Dieng
The Africa I Know