Learning to Ponder: Memory in Deep Neural Networks with Andrea Banino
EPISODE 528 | OCTOBER 18, 2021
About this Episode
Today we're joined by Andrea Banino, a research scientist at DeepMind. In our conversation with Andrea, we explore his interest in artificial general intelligence by way of episodic memory, the relationship between memory and intelligence, the challenges of applying memory in the context of neural networks, and how to overcome problems of generalization.
We also discuss his work on PonderNet, a neural network that "budgets" its computational effort according to the inherent complexity of the problem it is solving, the impetus and goals of that research, and how PonderNet connects to his memory research.
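For readers curious what "budgeting computation" looks like in practice, here is a minimal, illustrative sketch of a PonderNet-style halting loop in PyTorch. It is a sketch under stated assumptions, not the paper's implementation: names such as `PonderSketch`, `halt_head`, and `max_steps` are invented for illustration. Each pondering step produces a prediction and a halting probability, and the resulting halting distribution is what lets the network learn how many steps a given input deserves.

```python
import torch
import torch.nn as nn

class PonderSketch(nn.Module):
    """Illustrative PonderNet-style model (not the paper's exact architecture).

    At each pondering step a recurrent core updates a hidden state and emits
    a prediction plus a halting probability lambda_n. The per-step halting
    distribution p_n = lambda_n * prod_{k<n}(1 - lambda_k) weights the loss.
    """

    def __init__(self, in_dim, hidden_dim, out_dim, max_steps=10):
        super().__init__()
        self.core = nn.GRUCell(in_dim, hidden_dim)   # recurrent "pondering" core
        self.out_head = nn.Linear(hidden_dim, out_dim)
        self.halt_head = nn.Linear(hidden_dim, 1)
        self.max_steps = max_steps

    def forward(self, x):
        h = x.new_zeros(x.size(0), self.core.hidden_size)
        preds, p_halt = [], []
        remainder = x.new_ones(x.size(0))            # probability of not having halted yet
        for n in range(self.max_steps):
            h = self.core(x, h)
            lam = torch.sigmoid(self.halt_head(h)).squeeze(-1)
            if n == self.max_steps - 1:
                lam = torch.ones_like(lam)           # force halting at the final step
            preds.append(self.out_head(h))
            p_halt.append(remainder * lam)           # p_n = lambda_n * prod_{k<n}(1 - lambda_k)
            remainder = remainder * (1 - lam)
        # Shapes: (max_steps, batch, out_dim) and (max_steps, batch)
        return torch.stack(preds), torch.stack(p_halt)
```

Training would then weight a per-step task loss by `p_halt` and regularize the halting distribution toward a geometric prior, which is the general shape of the PonderNet objective; the exact details are in the paper linked below.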
About the Guest
Andrea Banino
DeepMind
Resources
- Paper: Vector-based navigation using grid-like representations in artificial agents
- Paper: Big-Loop Recurrence within the Hippocampal System Supports Integration of Information across Episodes
- Paper: Memo: A deep network for flexible combination of episodic memories
- Paper: CoBERL: Contrastive BERT for Reinforcement Learning
- Paper: Hybrid computing using a neural network with dynamic external memory
- Paper: Memory Networks
- Paper: Direct Fit to Nature: An Evolutionary Perspective on Biological and Artificial Neural Networks
- Karpathy's Tweet