Today we continue our coverage of the AWS ML Summit, joined by Chris Fregly, a principal developer advocate at AWS, and Antje Barth, a senior developer advocate at AWS.
In our conversation with Chris and Antje, we explore their roles as community builders prior to, and since, joining AWS, as well as their recently released book Data Science on AWS. In the book, Chris and Antje demonstrate how to reduce cost and improve performance while successfully building and deploying data science projects.
We also discuss the release of their new Practical Data Science Specialization on Coursera, managing the complexity that comes with building real-world projects, and some of their favorite sessions from the recent ML Summit (you can catch the session videos here).
You know AWS as a cloud computing technology leader, but did you realize the company offers a broad array of services and infrastructure at all three layers of the machine learning technology stack? AWS has helped more than 100,000 customers of all sizes and across industries innovate using ML and AI with industry-leading capabilities, and they're taking the same approach to make it easy, practical, and cost-effective for customers to use generative AI in their businesses. At the bottom layer of the ML stack, they're making generative AI cost-efficient with Amazon EC2 Inf2 instances powered by AWS Inferentia2 chips. At the middle layer, they're making generative AI app development easier with Amazon Bedrock, a managed service that makes pre-trained FMs easily accessible via an API. And at the top layer, Amazon CodeWhisperer is now generally available, with support for more than 10 programming languages.
To learn more about AWS ML and AI services, and how they’re helping customers accelerate their machine learning journeys, visit twimlai.com/go/awsml.