Patterns and Middleware for LLM Applications with Kyle Roche

EPISODE 659


About this Episode

Today we’re joined by Kyle Roche, founder and CEO of Griptape, to discuss patterns and middleware for LLM applications. We dive into emerging patterns for developing LLM applications, such as off-prompt data, which allows data retrieval without compromising the chain of thought within language models, and pipelines, which are sequential tasks given to LLMs that can involve a different model for each task or step in the pipeline. We also explore Griptape, an open-source, Python-based middleware stack that aims to securely connect LLM applications to an organization’s internal and external data systems. We discuss the abstractions it offers, including drivers, memory management, rule sets, DAG-based workflows, and a prompt stack. Additionally, we touch on common customer concerns such as privacy, retraining, and sovereignty issues, and several use cases that leverage role-based retrieval methods to optimize human augmentation tasks.
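The pipeline pattern described above, sequential tasks where each step can be served by a different model, can be sketched in plain Python. Note this is a hypothetical illustration of the concept, not the Griptape API; the `Task` and `Pipeline` classes, the model names, and the `run` callables are all invented for this example.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Task:
    """One step in the pipeline, bound to a (hypothetical) model."""
    name: str
    model: str                      # which LLM would serve this step
    run: Callable[[str], str]       # transforms the previous step's output

@dataclass
class Pipeline:
    """Runs tasks sequentially, feeding each output into the next task."""
    tasks: List[Task] = field(default_factory=list)

    def add(self, task: Task) -> "Pipeline":
        self.tasks.append(task)
        return self

    def execute(self, prompt: str) -> str:
        output = prompt
        for task in self.tasks:
            # In a real middleware stack this would call task.model's API;
            # here each run() is a stand-in for that call.
            output = task.run(output)
        return output

# Usage: a cheaper model summarizes, a stronger model refines the summary.
pipeline = (
    Pipeline()
    .add(Task("summarize", "small-model", lambda s: f"summary({s})"))
    .add(Task("refine", "large-model", lambda s: f"refined({s})"))
)
print(pipeline.execute("raw text"))  # refined(summary(raw text))
```

The key design point is that model choice is a per-task property rather than a pipeline-wide one, which lets cheap models handle routine steps and reserves expensive models for the steps that need them.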


Thanks to our sponsor Amazon Web Services

You know AWS as a cloud computing technology leader, but did you realize the company offers a broad array of services and infrastructure at all three layers of the machine learning technology stack? AWS has helped more than 100,000 customers of all sizes and across industries to innovate using ML and AI with industry-leading capabilities and they’re taking the same approach to make it easy, practical, and cost-effective for customers to use generative AI in their businesses. At the bottom layer of the ML stack, they’re making generative AI cost-efficient with Amazon EC2 Inf2 instances powered by AWS Inferentia2 chips. At the middle layer, they’re making generative AI app development easier with Amazon Bedrock, a managed service that makes pre-trained FMs easily accessible via an API. And at the top layer, Amazon CodeWhisperer is generally available now, with support for more than 10 programming languages.

To learn more about AWS ML and AI services, and how they’re helping customers accelerate their machine learning journeys, visit twimlai.com/go/awsml.

