The Fastai v1 Deep Learning Framework with Jeremy Howard
EPISODE 186 | OCTOBER 2, 2018
About this Episode
In today's episode we'll be taking a break from our Strata Data conference series and presenting a special conversation with Jeremy Howard, founder and researcher at Fast.ai.
Fast.ai is a company many of our listeners are quite familiar with thanks to its popular deep learning course. This episode is being released today in conjunction with the company's announcement of version 1.0 of its fastai library at the inaugural Pytorch Devcon in San Francisco. Jeremy and I cover a ton of ground in this conversation. Of course, we dive into the new library and explore why it's important and what's changed. We also explore the unique way in which it was developed and what it means for the future of the fast.ai courses. Jeremy shares a ton of great insights and lessons learned, and mentions a bunch of really interesting-sounding papers along the way.
Don't forget to join our community of machine learning enthusiasts, including our study groups for the fast.ai courses, at twimlai.com/meetup.
About the Guest
Jeremy Howard
Fast.ai
Resources
- Fastai
- Fastai v1 Deep Learning Library
- Pytorch
- J Programming Language
- APL Programming Language
- Practical Deep Learning with Rachel Thomas - TWIML Talk #138
- Pytorch: Fast Differentiable Dynamic Graphs in Python with Soumith Chintala - TWIML Talk #70
- Sylvain Gugger
- Paper: Universal Language Model Fine-tuning for Text Classification
- Paper: Cyclical Learning Rates for Training Neural Networks
- Paper: Super-Convergence: Very Fast Training of Neural Networks Using Large Learning Rates
- TWIML Presents: Series page