Today we’re joined by Andreas Madsen, an independent researcher based in Denmark whose research focuses on developing interpretable machine learning models.
Subscribe: iTunes / Google Play / Spotify / RSS
While we caught up with Andreas to discuss his ICLR spotlight paper, “Neural Arithmetic Units,” we also spent time exploring his experience as an independent researcher. We discuss the difficulties of working with limited resources, the importance of finding peers to collaborate with, and tempering expectations about getting papers accepted to conferences — something that might take a few tries to get right. In the paper, Andreas notes that neural networks struggle to perform exact arithmetic operations over real numbers, and proposes two components to address this: the Neural Addition Unit (NAU), which can learn exact addition and subtraction, and the Neural Multiplication Unit (NMU), which can multiply subsets of a vector.
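For readers curious about the mechanics, the forward passes of the two units can be sketched in a few lines of NumPy. The NAU is a linear layer whose weights are regularized toward {-1, 0, 1}; the NMU gates each input into a product via W·x + 1 − W. The weight matrices and toy inputs below are illustrative examples, not values from the paper:

```python
import numpy as np

def nau_forward(x, W):
    # Neural Addition Unit: a plain linear layer y = W @ x.
    # During training, W is regularized toward {-1, 0, 1}, so each
    # output converges to an exact signed sum of a subset of inputs.
    return W @ x

def nmu_forward(x, W):
    # Neural Multiplication Unit: output j is
    #   prod_i (W[j, i] * x[i] + 1 - W[j, i])
    # With W[j, i] in [0, 1], a weight of 1 includes x[i] in the
    # product, and a weight of 0 gates it out (the factor becomes 1).
    return np.prod(W * x + 1 - W, axis=1)

# With converged (discrete) weights, the units compute exact arithmetic:
x = np.array([2.0, 3.0, 5.0])
W_add = np.array([[1.0, -1.0, 0.0]])   # selects x0 - x1
W_mul = np.array([[1.0, 0.0, 1.0]])    # selects x0 * x2
print(nau_forward(x, W_add))  # [-1.]
print(nmu_forward(x, W_mul))  # [10.]
```

The key design point, discussed in the episode, is that the NMU's gating form lets a unit cleanly exclude inputs (the factor collapses to 1), which is what allows it to extrapolate to numbers outside the training range.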
Connect with Andreas!
- Paper: Neural Arithmetic Logic Units
- Andrew Trask
- Paper: Neural GPUs Learn Algorithms
- Neural Arithmetic Units: Paper – Code
- Book: Programming Collective Intelligence
- Blog: Becoming an Independent Researcher and getting published in ICLR with spotlight
- Join the TWIML Community!
- Check out our TWIML Presents: series page!
- Register for the TWIML Newsletter
- Check out the official TWIMLcon:AI Platform video packages here!
- Download our latest eBook, The Definitive Guide to AI Platforms!
“More On That Later” by Lee Rosevere licensed under CC By 4.0