Simplifying On-Device AI for Developers with Siddhika Nevrekar

EPISODE 697


About this Episode

Today, we're joined by Siddhika Nevrekar, head of AI Hub at Qualcomm Technologies, to discuss on-device AI and how to make it easier for developers to take advantage of device capabilities. We unpack the motivations for AI engineers to move model inference from the cloud to local devices, and explore the challenges associated with on-device AI. We dig into the role of hardware solutions, from powerful systems-on-chip (SoCs) to neural processors, the importance of collaboration between chip manufacturers and community runtimes like ONNX and TFLite, the unique challenges of IoT and autonomous vehicles, and the key metrics developers should focus on to ensure optimal on-device performance. Finally, Siddhika introduces Qualcomm AI Hub, a platform developed to simplify the process of testing and optimizing AI models across different devices.


Thanks to our sponsor Qualcomm Technologies

I’d like to send a huge thanks to our friends at Qualcomm for their support of the podcast and their sponsorship of today’s episode. Qualcomm® AI Hub simplifies deploying AI models for vision, audio, and speech applications to edge devices. You can optimize, validate, and deploy your own AI models on hosted Qualcomm platform devices within minutes. To learn more about how Qualcomm AI Hub is enhancing the AI developer experience on mobile and edge devices, visit: https://aihub.qualcomm.com
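To give a sense of that "optimize, validate, and deploy" workflow, here is a minimal sketch using the qai_hub Python client. The specific model, device name, input shape, and method names below are illustrative assumptions based on the AI Hub client, not an official recipe; see aihub.qualcomm.com for the current API.

```python
# Minimal sketch: compile and profile a model on a hosted device via Qualcomm AI Hub.
# Assumes the qai_hub Python client is installed and an API token is configured.
# The device name and input shape are illustrative placeholders.
import torch
import torchvision
import qai_hub as hub

# Trace a standard torchvision model so it can be submitted for compilation.
model = torchvision.models.mobilenet_v2(weights="DEFAULT").eval()
example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)

# Submit a compile job targeting a hosted device, then profile the compiled model on it.
device = hub.Device("Samsung Galaxy S24 (Family)")  # hypothetical target device
compile_job = hub.submit_compile_job(
    model=traced,
    device=device,
    input_specs=dict(image=(1, 3, 224, 224)),
)
profile_job = hub.submit_profile_job(
    model=compile_job.get_target_model(),
    device=device,
)
# The profile job reports on-device metrics such as inference latency and memory use.
```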


