Responsible AI in the Generative Era with Michael Kearns
About the Guest
Michael Kearns
University of Pennsylvania, Amazon
Thanks to our sponsor Amazon Web Services
You know AWS as a cloud computing technology leader, but did you realize the company offers a broad array of services and infrastructure at all three layers of the machine learning technology stack? AWS has helped more than 100,000 customers of all sizes and across industries innovate using ML and AI with industry-leading capabilities, and they’re taking the same approach to make it easy, practical, and cost-effective for customers to use generative AI in their businesses. At the bottom layer of the ML stack, they’re making generative AI cost-efficient with Amazon EC2 Inf2 instances powered by AWS Inferentia2 chips. At the middle layer, they’re making generative AI app development easier with Amazon Bedrock, a managed service that makes pre-trained foundation models (FMs) easily accessible via an API. And at the top layer, Amazon CodeWhisperer is generally available now, with support for more than 10 programming languages.
To learn more about AWS ML and AI services, and how they’re helping customers accelerate their machine learning journeys, visit twimlai.com/go/awsml.
Resources
- Blog: Announcing new tools and capabilities to enable responsible AI innovation
- Survey: New survey findings show that businesses plan to increase investments in responsible AI in 2024
- AWS Responsible AI
- Blog: Responsible AI in the wild: Lessons learned at AWS
- AWS re:Invent 2023
- Service Cards and ML Governance with Michael Kearns - #610
- Edutainment for AI and AWS PartyRock with Mike Miller - #661
- Data, Systems and ML for Visual Understanding with Cody Coleman - #660
- Patterns and Middleware for LLM Applications with Kyle Roche - #659
- The Enterprise LLM Landscape with Atul Deo - #640
- Open Source Generative AI at Hugging Face with Jeff Boudier - #624