Deploying Edge and Embedded AI Systems with Heather Gorr

EPISODE 655

About this Episode

Today we’re joined by Heather Gorr, principal MATLAB product marketing manager at MathWorks. In our conversation with Heather, we discuss the deployment of AI models to hardware devices and embedded AI systems. We explore the factors to consider during data preparation, model development, and ultimately deployment to ensure a successful project, including device constraints and latency requirements, which dictate how much data flows onto the device and how often. We also discuss modeling needs such as explainability, robustness, and quantization; the use of simulation throughout the modeling process; the robust verification and validation methodologies needed to ensure safety and reliability; and the adaptation of MLOps techniques for speed and consistency. Finally, Heather shares noteworthy anecdotes about embedded AI deployments in industries including automotive and oil & gas.
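
The episode stays at the conversational level, but as a minimal sketch of the quantization step mentioned above, here is one common approach: post-training dynamic quantization with PyTorch. The toy network and layer choices are illustrative assumptions for this page, not anything taken from the episode or from MathWorks' tooling.

    # Minimal sketch: post-training dynamic quantization to shrink a model
    # before deploying it to a resource-constrained device. The network below
    # is a hypothetical example, not one discussed in the episode.
    import io
    import torch
    import torch.nn as nn

    # A small fully connected network standing in for an embedded-friendly model.
    model = nn.Sequential(
        nn.Linear(64, 128),
        nn.ReLU(),
        nn.Linear(128, 10),
    )
    model.eval()

    # Convert Linear-layer weights from float32 to int8; activations are
    # quantized dynamically at inference time.
    quantized = torch.quantization.quantize_dynamic(
        model, {nn.Linear}, dtype=torch.qint8
    )

    # Compare serialized sizes to see the footprint reduction that matters
    # for on-device memory and latency constraints.
    def size_bytes(m):
        buf = io.BytesIO()
        torch.save(m.state_dict(), buf)
        return buf.getbuffer().nbytes

    print(f"float32 model: {size_bytes(model)} bytes")
    print(f"int8 model:    {size_bytes(quantized)} bytes")

On a real embedded target the quantized model would typically be exported further, for example to generated C code or an optimized runtime, which is where the verification and validation practices Heather describes come into play.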


Thanks to our sponsor MathWorks

MathWorks is the leading developer of mathematical computing software. Engineers and scientists worldwide rely on MATLAB® and Simulink® to accelerate the pace of discovery, innovation, development, and learning.

With MATLAB, engineers and scientists can create better AI datasets, develop and operationalize AI solutions, and continuously test AI models in a system-wide context. AI can be incorporated into your applications in just a few lines of MATLAB code, whether you’re designing algorithms, preparing and labeling data, or generating code and deploying to embedded systems.

To learn more about how MATLAB is the enterprise engineering platform for AI, visit mathworks.com/ai.

