In today’s episode we’ll be taking a break from our Strata Data conference series and presenting a special conversation with Jeremy Howard, founder and researcher at fast.ai.
Subscribe: iTunes / Google Play / Spotify / RSS
fast.ai is a company many of our listeners are quite familiar with thanks to its popular deep learning course. This episode is being released today in conjunction with the company’s announcement of version 1.0 of its fastai library at the inaugural PyTorch DevCon in San Francisco. Jeremy and I cover a ton of ground in this conversation. Of course, we dive into the new library, exploring why it’s important and what’s changed. We also discuss the unique way in which it was developed and what it means for the future of the fast.ai courses. Jeremy shares a wealth of great insights and lessons learned, not to mention a bunch of really interesting papers.
Don’t forget to join our community of machine learning enthusiasts, including our study groups for the fast.ai courses, at twimlai.com/meetup.
About Jeremy
Jeremy Howard is a founder and researcher at fast.ai.
Mentioned in the Interview
- fast.ai
- fastai v1 Deep Learning Library
- PyTorch
- J Programming Language
- APL Programming Language
- Practical Deep Learning with Rachel Thomas – TWIML Talk #138
- PyTorch: Fast Differentiable Dynamic Graphs in Python with Soumith Chintala – TWIML Talk #70
- Sylvain Gugger
- Paper: Universal Language Model Fine-tuning for Text Classification
- Paper: Cyclical Learning Rates for Training Neural Networks
- Paper: Super-Convergence: Very Fast Training of Neural Networks Using Large Learning Rates
- TWIML Presents: Series page
- TWIML Events Page
- TWIML Meetup
- TWIML Newsletter
“More On That Later” by Lee Rosevere, licensed under CC BY 4.0