In this episode, we're joined by Sebastian Ruder, a PhD student studying natural language processing at the National University of Ireland and a Research Scientist at text analysis startup Aylien.
In our conversation, Sebastian and I discuss recent milestones in neural NLP, including multi-task learning and pretrained language models. We also discuss the use of attention-based models, Tree RNNs and LSTMs, and memory-based networks. Finally, Sebastian walks us through his recent ULMFiT paper, short for "Universal Language Model Fine-tuning for Text Classification," which he co-authored with Jeremy Howard of fast.ai, whom I interviewed in episode 186.