In this episode, I’m joined by Adji Bousso Dieng, PhD student in the Department of Statistics at Columbia University.
Subscribe: iTunes / Google Play / Spotify / RSS
In this interview, Adji and I discuss two of her recent papers. The first, accepted at this year’s ICML conference and titled “Noisin: Unbiased Regularization for Recurrent Neural Networks,” presents, as the name implies, a new way to regularize RNNs via noise injection. The second, last year’s ICLR submission “TopicRNN: A Recurrent Neural Network with Long-Range Semantic Dependency,” debuts an RNN-based language model designed to capture the global semantic meaning relating words in a document via latent topics. We dive into the details behind both of these papers, and I learn a ton along the way.
On July 17th at 5pm PT, Nic Teague will lead a discussion of the paper “Quantum Machine Learning” by Jacob Biamonte et al., which explores how to devise and implement concrete quantum software for accomplishing machine learning tasks. If you haven’t joined our meetup yet, visit twimlai.com/meetup to do so.
Be sure to sign up for our weekly newsletter. We recently shared a write-up detailing the ML/AI Job Board we’re working on and received a ton of encouragement and interest. To make sure you don’t miss anything, head over to twimlai.com/newsletter to sign up.
Mentioned in the Interview
- Paper: Noisin: Unbiased Regularization for Recurrent Neural Networks
- Paper: TopicRNN: A Recurrent Neural Network with Long-Range Semantic Dependency
- TWiML Presents: Series page
- TWiML Events Page
- TWiML Meetup
- TWiML Newsletter
“More On That Later” by Lee Rosevere licensed under CC By 4.0