In this episode of our AI Rewind series, we’re bringing back one of your favorite guests of the year, Jeremy Howard, founder and researcher at fast.ai.
Jeremy joins us to discuss trends in Deep Learning in 2018 and beyond. We cover many of the papers, tools and techniques that have contributed to making deep learning more accessible than ever to so many developers and data scientists.
About Jeremy
Mentioned in the Interview
- DAWNBench
- Paper: Super-Convergence: Very Fast Training of Neural Networks Using Large Learning Rates
- Paper: Fixing Weight Decay Regularization in Adam
- Jason Antic – DeOldify
- Paper: Improved Regularization of Convolutional Neural Networks with Cutout
- Paper: Addressing Function Approximation Error in Actor-Critic Methods
- Judy Gichoya
- PyTorch 1.0
- fastai 1.0 (a brief training sketch using this release appears after this list)
- PyTorch JIT (the “just-in-time” compiler)
- GPyTorch: A Highly Efficient and Modular Implementation of Gaussian Processes in PyTorch
- TVM Deep Learning Compiler
- Swift
- F#
- Julia
- Platform.ai
- Check out all of our great series from 2018 at the TWIML Presents: Series page!
- TWIML Online Meetup
- Register for the TWIML Newsletter
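Several of the items above are connected in practice: fastai 1.0 builds the 1cycle learning-rate policy from the Super-Convergence paper directly into its training API, the same approach behind the fast.ai DAWNBench entries. As a rough illustration only (not code from the episode), a minimal fastai v1 image-classification sketch might look like the following; the dataset, architecture, and hyperparameters are placeholder choices, and the API shown is the 1.0-era one, which differs from later fastai releases.

```python
# Minimal sketch assuming the fastai v1 (1.0-era) API discussed in the episode;
# dataset, architecture, and hyperparameters are illustrative placeholders.
from fastai.vision import *  # fastai v1's idiomatic star import

path = untar_data(URLs.MNIST_SAMPLE)             # small sample dataset shipped with fastai
data = ImageDataBunch.from_folder(path)          # build train/valid loaders from folders
learn = create_cnn(data, models.resnet18, metrics=accuracy)

learn.lr_find()                      # sweep learning rates to choose a sensible maximum
learn.recorder.plot()                # inspect the loss-vs-learning-rate curve
learn.fit_one_cycle(4, max_lr=1e-2)  # 1cycle policy from the Super-Convergence paper
```

Typical usage is to run `lr_find` first, read a reasonable peak learning rate off the plot, and pass it as `max_lr` to `fit_one_cycle`.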
“More On That Later” by Lee Rosevere, licensed under CC BY 4.0
Matthew Corkum
Great podcast… always a pleasure to hear from Jeremy Howard on the topic of Deep Learning. He is very humble, honest, and hands-on with what works for building more resilient solutions.