Today we’re joined by David Duvenaud, Assistant Professor at the University of Toronto.
Subscribe: iTunes / Google Play / Spotify / RSS
David, who joined us on episode #96 back in January 2018, returns to talk about the papers that have come out of his lab over the past year and a half, focused on Neural Ordinary Differential Equations (ODEs), a type of continuous-depth neural network. In our conversation, we walk through quite a few of David’s papers on the topic, which you can find linked below. We discuss the problem David is trying to solve with this research, the potential for ODEs to replace “the backbone” of the neural networks in use today, and David’s approach to engineering.
Connect with David!
- Paper: Neural Ordinary Differential Equations
- Paper: Neural Networks with Cheap Differential Operators
- Paper: FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models
- Paper: Latent Ordinary Differential Equations for Irregularly-Sampled Time Series
- Paper: Residual Flows for Invertible Generative Modeling
- Paper: Invertible Residual Networks
- Paper: Efficient Graph Generation With Graph Recurrent Attention Networks
- Paper: Deep Kalman Filters
- Join the TWIML Community!
- Check out our TWIML Presents: series page!
- Register for the TWIML Newsletter
- Check out the official TWIMLcon:AI Platform video packages here!
- Download our latest eBook, The Definitive Guide to AI Platforms!
“More On That Later” by Lee Rosevere licensed under CC By 4.0