tinyML Summit 2023: Designing Multi-Modal Smart Human Machine Interfaces with Microcontrollers
Sriram KALLURI, Product Manager, NXP Semiconductors
Explore the development of a solution kit for next-generation smart human-machine interfaces (HMI) that enables multi-modal, intelligent, hands-free capabilities, including machine learning (ML), vision for face and gesture recognition, far-field voice control, and a 2D graphical user interface (GUI), in an overall system design using just a single high-performance MCU.
The machine learning applications of face and gesture recognition and voice control are performed entirely on the edge device, eliminating the need for the cloud along with the privacy and latency concerns that come with it. The audience will learn about the design decisions behind a development kit with a variety of features to help minimize time to market, risk, and development effort, including fully integrated turnkey software and hardware reference designs, balanced with a software framework that gives designers the flexibility to customize vision functions, voice functions, or a combination of the two.