Block Sparse Kernels for Deep Neural Networks with Durk Kingma

EPISODE 80

About this Episode

This show is part of a series that I'm really excited about, in part because I've been working to bring it to you for quite a while now. The series samples some of the interesting work being done at OpenAI, the independent AI research lab founded by Elon Musk, Sam Altman, and others. This episode features Durk Kingma, a Research Scientist at OpenAI. Although Durk is probably best known for his pioneering work on variational autoencoders, he joined me this time to talk through his latest project on block-sparse kernels, which OpenAI published just this week.

Block sparsity is a structural property of the weight matrices found in certain neural networks, and OpenAI's block-sparse kernels make it more computationally efficient to take advantage of it. In addition to covering the kernels themselves and the background needed to understand them, we discuss why they matter and walk through some examples of how they can be used. I'm happy to present another fine Nerd Alert show to close out this OpenAI series, and I know you'll enjoy it!
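For readers who want a concrete picture before listening: the sketch below illustrates the basic idea of a block-sparse multiply in plain NumPy. It is only an illustration of the concept, not OpenAI's actual GPU kernels; the function name, block size, and layout representation are all my own assumptions.

```python
import numpy as np

BLOCK = 4  # illustrative block size; real kernels use sizes like 8, 16, or 32

def block_sparse_matmul(x, blocks, layout, n_block_cols):
    """Multiply dense input x by a block-sparse weight matrix.

    blocks: dict mapping (block_row, block_col) -> dense BLOCK x BLOCK array
    layout: list of (block_row, block_col) keys for the nonzero blocks

    Only the nonzero blocks are stored and multiplied, which is where
    the compute and memory savings come from.
    """
    out = np.zeros((x.shape[0], n_block_cols * BLOCK))
    for (r, c) in layout:
        w = blocks[(r, c)]
        out[:, c * BLOCK:(c + 1) * BLOCK] += x[:, r * BLOCK:(r + 1) * BLOCK] @ w
    return out

# Example: an 8x8 weight with only the diagonal blocks nonzero (50% block sparsity)
rng = np.random.default_rng(0)
layout = [(0, 0), (1, 1)]
blocks = {k: rng.standard_normal((BLOCK, BLOCK)) for k in layout}

# Equivalent dense matrix, built only to check the result
W = np.zeros((8, 8))
for (r, c), w in blocks.items():
    W[r * BLOCK:(r + 1) * BLOCK, c * BLOCK:(c + 1) * BLOCK] = w

x = rng.standard_normal((2, 8))
assert np.allclose(block_sparse_matmul(x, blocks, layout, 2), x @ W)
```

The point of the example is that the loop touches only the blocks listed in `layout`; a dense multiply would do the full 8x8 work regardless of how many blocks are zero.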


Thanks to our sponsor NVIDIA

NVIDIA’s invention of the GPU in 1999 sparked the growth of the PC gaming market and has redefined modern computer graphics, high performance computing and artificial intelligence. The company’s pioneering work in accelerated computing and AI is reshaping trillion-dollar industries, such as transportation, healthcare and manufacturing, and fueling the growth of many others.
