Neural Ordinary Differential Equations with David Duvenaud
EPISODE 364 | APRIL 9, 2020
About this Episode
Today we're joined by David Duvenaud, Assistant Professor at the University of Toronto.
David, who previously joined us on episode #96 in January 2018, returns to talk about the papers that have come out of his lab over the last year and change, focused on Neural Ordinary Differential Equations (Neural ODEs), a type of continuous-depth neural network. In our conversation, we walk through quite a few of David's papers on the topic, which you can find below. We discuss the problem David is trying to solve with this research, the potential for ODEs to replace "the backbone" of the neural networks in use today, and David's approach to engineering.
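For readers unfamiliar with the idea, here is a minimal illustrative sketch (not code from the papers above): a continuous-depth network replaces a discrete stack of layers with an ODE, dh/dt = f(h(t), θ), whose solution at the final time is the network's output. The dynamics function `f`, its weights, and the fixed-step Euler solver below are all hypothetical choices for illustration; real implementations use adaptive solvers and the adjoint method for training.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 4))  # hypothetical dynamics weights
b = np.zeros(4)

def f(h):
    """Dynamics function: plays the role of one 'layer', applied continuously."""
    return np.tanh(h @ W + b)

def neural_ode_forward(h0, t1=1.0, steps=100):
    """Euler-integrate dh/dt = f(h) from t=0 to t=t1 (a stand-in for a real ODE solver)."""
    h = h0
    dt = t1 / steps
    for _ in range(steps):
        h = h + dt * f(h)
    return h

x = rng.normal(size=(3, 4))   # batch of 3 hidden states
out = neural_ode_forward(x)   # same shape as x, transformed by the continuous dynamics
print(out.shape)
```

Increasing `steps` refines the depth of the network continuously rather than adding discrete layers, which is the core idea behind the "continuous-depth" framing.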
About the Guest
David Duvenaud
University of Toronto
Resources
- Paper: Neural Ordinary Differential Equations
- Paper: Neural Networks with Cheap Differential Operators
- Paper: FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models
- Paper: Latent Ordinary Differential Equations for Irregularly-Sampled Time Series
- Paper: Residual Flows for Invertible Generative Modeling
- Paper: Invertible Residual Networks
- Paper: Efficient Graph Generation With Graph Recurrent Attention Networks
- Paper: Deep Kalman Filters