Milestones in Neural Natural Language Processing with Sebastian Ruder

EPISODE 195 | OCTOBER 29, 2018

About this Episode

In this episode, we're joined by Sebastian Ruder, a PhD student studying natural language processing at the National University of Ireland and a research scientist at the text analysis startup Aylien. In our conversation, Sebastian and I discuss recent milestones in neural NLP, including multi-task learning and pretrained language models. We also discuss attention-based models, Tree RNNs and LSTMs, and memory-based networks. Finally, Sebastian walks us through his recent ULMFiT paper, short for "Universal Language Model Fine-tuning for Text Classification," which he co-authored with Jeremy Howard of fast.ai, whom I interviewed in episode 186.
