Open Source Generative AI at Hugging Face with Jeff Boudier

EPISODE 624

About this Episode

Today we’re joined by Jeff Boudier, head of product at Hugging Face 🤗. In our conversation with Jeff, we explore the current landscape of open-source machine learning tools and models, the recent shift toward consumer-focused releases, and the importance of making ML tools accessible. We also discuss the growth of the Hugging Face Hub, which currently hosts more than 150,000 models, and how Hugging Face’s newly formalized collaboration with AWS will help drive adoption of open-source models in the enterprise.
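
For readers who want to try the kinds of open-source models Jeff describes, the minimal sketch below pulls a model from the Hugging Face Hub with the transformers library. The model name "distilgpt2" and the prompt are illustrative placeholders, not examples mentioned in the episode.

# Minimal sketch: download and run an open-source model from the Hugging Face Hub.
# "distilgpt2" is an illustrative model choice, not one discussed in the episode.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
result = generator("Open source machine learning is", max_new_tokens=20)
print(result[0]["generated_text"])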

Thanks to our sponsor Amazon Web Services

You know AWS as a cloud computing technology leader, but did you realize the company offers a broad array of services and infrastructure at all three layers of the machine learning technology stack? AWS has helped more than 100,000 customers of all sizes and across industries innovate using ML and AI with industry-leading capabilities, and they’re taking the same approach to make it easy, practical, and cost-effective for customers to use generative AI in their businesses. At the bottom layer of the ML stack, they’re making generative AI cost-efficient with Amazon EC2 Inf2 instances powered by AWS Inferentia2 chips. At the middle layer, they’re making generative AI app development easier with Amazon Bedrock, a managed service that makes pre-trained foundation models (FMs) easily accessible via an API. And at the top layer, Amazon CodeWhisperer is now generally available, with support for more than 10 programming languages.
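
As a hedged sketch of what "accessible via an API" looks like in practice, the snippet below invokes a pre-trained foundation model through Amazon Bedrock with boto3. The model ID and request body schema are assumptions for illustration; each Bedrock model defines its own body format, so consult the Bedrock documentation for the model you choose.

# Hedged sketch: invoking a pre-trained foundation model through Amazon Bedrock.
# The model ID and request body schema below are illustrative assumptions;
# every Bedrock model defines its own request/response format.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed model ID for illustration
    body=json.dumps({"inputText": "Explain generative AI in one sentence."}),
)
print(json.loads(response["body"].read()))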

To learn more about AWS ML and AI services, and how they’re helping customers accelerate their machine learning journeys, visit twimlai.com/go/awsml.
