Building LLM-Based Applications with Azure OpenAI with Jay Emery

EPISODE 657


About this Episode

Today we’re joined by Jay Emery, director of technical sales & architecture at Microsoft Azure. In our conversation with Jay, we discuss the challenges organizations face when building LLM-based applications and explore some of the techniques they are using to overcome them. We dive into concerns around security, data privacy, cost management, and performance, as well as when prompting alone can achieve the desired results and when fine-tuning is the better approach. We cover methods for enhancing LLM output, such as prompt tuning, prompt chaining, prompt variance, fine-tuning, and retrieval-augmented generation (RAG), along with ways to speed up inference performance, such as choosing the right model, parallelization, and provisioned throughput units (PTUs). Jay also shares several intriguing use cases describing how businesses use tools like Azure Machine Learning prompt flow and Azure ML AI Studio to tailor LLMs to their unique needs and processes.
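
Of the techniques mentioned above, prompt chaining is straightforward to picture in code. Below is a minimal, hypothetical sketch using the Azure OpenAI service through the `openai` Python SDK; the endpoint and API key environment variables, the API version, the deployment name (`gpt-4o`), and the example ticket text are placeholder assumptions, not values from the episode.

```python
# Minimal prompt-chaining sketch against Azure OpenAI (openai>=1.0 Python SDK).
# The API version and deployment name are assumptions; substitute the values
# from your own Azure OpenAI resource.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com/
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # assumed API version
)

DEPLOYMENT = "gpt-4o"  # name of your model deployment (assumption)

def ask(prompt: str) -> str:
    """One chat-completion call; each link in the chain reuses this helper."""
    response = client.chat.completions.create(
        model=DEPLOYMENT,
        messages=[{"role": "user", "content": prompt}],
        temperature=0.2,
    )
    return response.choices[0].message.content

# Step 1: condense a long input into key facts.
ticket = "Customer reports intermittent 503 errors from the checkout API since Tuesday."
summary = ask(f"Summarize the key facts in this support ticket as bullet points:\n\n{ticket}")

# Step 2: feed the first model's output into a second, more focused prompt.
reply = ask(
    "Using only these facts, draft a short, empathetic response to the customer "
    f"with next troubleshooting steps:\n\n{summary}"
)
print(reply)
```

Splitting the task into two smaller calls keeps each prompt easier to inspect and tune, which is the basic appeal of chaining prompts rather than relying on a single large one.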

Connect with Jay

Thanks to our sponsor Microsoft

I’d like to send a huge thanks to our friends at Microsoft for their support of the podcast and their sponsorship of today’s episode. Microsoft is your gateway to the future through cutting-edge AI technology! From virtual assistants to groundbreaking machine learning, Microsoft is a leader in AI innovation. The company is committed to ensuring responsible AI use, empowering people worldwide, including startups and digital natives, with intelligent technology to tackle societal challenges in sustainability, accessibility, and humanitarian action. Microsoft technologies empower you, your startup, and your digital community to achieve more and innovate boundlessly. Explore the possibilities by visiting Microsoft.ai.


