Milestones in Neural Natural Language Processing with Sebastian Ruder
EPISODE 195 | OCTOBER 29, 2018
About this Episode
In this episode, we're joined by Sebastian Ruder, a PhD student in natural language processing at the National University of Ireland, Galway, and a research scientist at the text analysis startup Aylien.
In our conversation, Sebastian and I discuss recent milestones in neural NLP, including multi-task learning and pretrained language models. We also discuss the use of attention-based models, Tree RNNs and LSTMs, and memory-based networks. Finally, Sebastian walks us through his recent ULMFiT paper, short for "Universal Language Model Fine-tuning for Text Classification," which he co-authored with Jeremy Howard of fast.ai, whom I interviewed in episode 186.
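For listeners curious about the mechanics behind ULMFiT, here is a minimal PyTorch sketch of two of the paper's fine-tuning techniques: discriminative learning rates and gradual unfreezing. The model, data, and hyperparameters below are toy placeholders rather than the paper's actual AWD-LSTM setup; only the 2.6 divisor between layer groups comes from the paper.

```python
import torch
import torch.nn as nn

# Toy stand-in for a pretrained language-model encoder plus a classifier head.
# ULMFiT uses a 3-layer AWD-LSTM; this placeholder just mirrors the structure.
class ToyClassifier(nn.Module):
    def __init__(self, vocab=1000, emb=64, hid=64, classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.rnn = nn.LSTM(emb, hid, batch_first=True)
        self.head = nn.Linear(hid, classes)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.head(h[:, -1])  # classify from the final hidden state

model = ToyClassifier()

# Layer groups, ordered from earliest (most general) to latest (task-specific).
groups = [model.embed, model.rnn, model.head]

# Discriminative learning rates: the paper divides the rate by 2.6
# for each earlier layer group.
base_lr = 1e-3
param_groups = [
    {"params": g.parameters(), "lr": base_lr / (2.6 ** (len(groups) - 1 - i))}
    for i, g in enumerate(groups)
]
optimizer = torch.optim.Adam(param_groups)

# Gradual unfreezing: freeze everything, then unfreeze one more group
# per epoch from the top down, so all layers train by the final epoch.
for p in model.parameters():
    p.requires_grad = False

x = torch.randint(0, 1000, (8, 20))  # fake token batch
y = torch.randint(0, 2, (8,))        # fake labels
loss_fn = nn.CrossEntropyLoss()

for epoch in range(len(groups)):
    for g in groups[len(groups) - 1 - epoch:]:
        for p in g.parameters():
            p.requires_grad = True
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```

Starting with only the classifier head trainable and unfreezing one layer group per epoch, with smaller learning rates for earlier layers, is the paper's recipe for avoiding catastrophic forgetting of the pretrained language model.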
About the Guest
Sebastian Ruder
DeepMind
Resources
- Blog Post: A Review of the Recent History of NLP
- Universal Language Model Fine-tuning for Text Classification
- Acoustic Word Embeddings for Low Resource Speech Processing with Herman Kamper
- Paper: A Unified Architecture for Natural Language Processing: Deep Neural Networks with Multitask Learning
- Distributed Representations of Words and Phrases and their Compositionality (Word2vec)
- TWIML Talk #48 - Word2Vec & Friends with Bruno Goncalves
- WaveNet
- Neural Turing Machines
- TWIML Talk #186 - The Fastai v1 Deep Learning Framework with Jeremy Howard
- N-gram
- Deep Learning Indaba Series
- TWIML Presents: Series page
