Milestones in Neural Natural Language Processing with Sebastian Ruder

EPISODE 195

About this Episode

In this episode, we're joined by Sebastian Ruder, a PhD student studying natural language processing at the National University of Ireland and a Research Scientist at text analysis startup Aylien.

In our conversation, Sebastian and I discuss recent milestones in neural NLP, including multi-task learning and pretrained language models. We also discuss the use of attention-based models, Tree RNNs and LSTMs, and memory-based networks. Finally, Sebastian walks us through his recent ULMFiT paper, short for "Universal Language Model Fine-tuning for Text Classification," which he co-authored with Jeremy Howard of fast.ai, whom I interviewed in episode 186.
