Milestones in Neural Natural Language Processing with Sebastian Ruder

This Week in Machine Learning & AI

In this episode, we’re joined by Sebastian Ruder, a PhD student in natural language processing at the National University of Ireland and a research scientist at the text analysis startup Aylien.

In our conversation, Sebastian and I discuss recent milestones in neural NLP, including multi-task learning and pretrained language models. We also discuss attention-based models, tree RNNs and LSTMs, and memory-based networks. Finally, Sebastian walks us through his recent ULMFiT paper, short for “Universal Language Model Fine-tuning for Text Classification,” which he co-authored with Jeremy Howard of fast.ai, whom I interviewed in episode 186.
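For readers curious about what ULMFiT actually does, here is a minimal, hypothetical PyTorch sketch of the recipe: start from a pretrained language model, then fine-tune a classifier on top of it using discriminative learning rates (lower layers get smaller learning rates) and gradual unfreezing (layers are unfrozen one at a time, from the top down). The `LMEncoder` and `Classifier` classes and all hyperparameters below are illustrative assumptions, not the authors’ code.

```python
# A minimal, hypothetical sketch of the ULMFiT recipe in PyTorch -- not the
# authors' implementation. Stage 1 (LM pretraining) and stage 2 (LM
# fine-tuning on the target corpus) are assumed done; this shows stage 3:
# classifier fine-tuning with discriminative learning rates and gradual
# unfreezing. LMEncoder, Classifier, and all hyperparameters are assumptions.

import torch
import torch.nn as nn


class LMEncoder(nn.Module):
    """Stand-in for a pretrained AWD-LSTM-style language model encoder."""

    def __init__(self, vocab_size=10000, emb=400, hidden=1150, layers=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        # Final layer projects back to the embedding size, as in AWD-LSTM.
        self.rnns = nn.ModuleList(
            nn.LSTM(emb if i == 0 else hidden,
                    hidden if i < layers - 1 else emb,
                    batch_first=True)
            for i in range(layers)
        )

    def forward(self, x):
        h = self.embed(x)
        for rnn in self.rnns:
            h, _ = rnn(h)
        return h  # (batch, seq_len, emb)


class Classifier(nn.Module):
    """Pretrained encoder plus a fresh classification head."""

    def __init__(self, encoder, n_classes=2, emb=400):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(emb, n_classes)

    def forward(self, x):
        h = self.encoder(x)
        return self.head(h[:, -1])  # classify from the final hidden state


encoder = LMEncoder()      # pretrained, LM-fine-tuned weights would load here
model = Classifier(encoder)

# Discriminative fine-tuning: each lower layer's learning rate is the layer
# above's divided by 2.6, the factor used in the paper.
base_lr, decay = 1e-3, 2.6
groups = [{"params": rnn.parameters(),
           "lr": base_lr / decay ** (len(encoder.rnns) - i)}
          for i, rnn in enumerate(encoder.rnns)]
groups.append({"params": model.head.parameters(), "lr": base_lr})
opt = torch.optim.Adam(groups)

# Gradual unfreezing: freeze the encoder, then unfreeze one RNN layer per
# epoch from the top down (embeddings stay frozen in this simplified sketch).
for p in encoder.parameters():
    p.requires_grad = False
for rnn in reversed(encoder.rnns):
    for p in rnn.parameters():
        p.requires_grad = True
    # ... one epoch of classifier training on labeled data would run here ...
```

In the paper, these two tricks are what let a single pretrained language model transfer to small text classification datasets without catastrophic forgetting.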

Thanks to our Sponsor!

I’d like to send a huge thanks to our friends at IBM for their sponsorship of this episode. Are you interested in exploring code patterns leveraging multiple technologies, including ML and AI? Then check out IBM Developer. With more than 100 open source programs, a library of knowledge resources, developer advocates ready to help, and a global community of developers, what in the world will you create? Dive in at ibm.biz/mlaipodcast, and be sure to let them know that TWiML sent you!

In addition, check out IBM’s Watson Natural Language Classifier (NLC) offering, which is particularly relevant to this episode!

About Sebastian

Mentioned in the Interview

“More On That Later” by Lee Rosevere, licensed under CC BY 4.0
