
How Syntiant Is Powering Embodied AI At The Edge with Kurt Busch, CEO of Syntiant



What if your car could hear you through its windows, and your glasses paid attention only to your voice in a noisy crowd? We sit down with Syntiant CEO Kurt Busch to explore how embodied AI is quietly moving from data centers into the devices we wear, drive, and use every day.

Kurt shares Syntiant’s origin story and the bet the company made in 2017: billions of small processors ship without native AI, and the world needs intelligence that runs right next to sensors. That vision has grown into a full stack: reliable microphones, ultra-low-power AI silicon, and battle-tested audio and computer vision models that run on Syntiant’s chips and across partner platforms. We dig into why on-device AI beats the cloud for responsiveness, privacy, and cost, and how sensor fusion, beamforming, and AI noise reduction let products understand people in the messiest real-world settings.
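
As a rough illustration of the beamforming idea mentioned above: a delay-and-sum beamformer time-aligns the signals from a microphone array so that sound arriving from one direction adds up coherently while off-axis noise averages down. The sketch below is a minimal, generic NumPy version, not Syntiant's implementation; all names and parameters are illustrative.

```python
import numpy as np

def delay_and_sum(mic_signals, mic_positions, look_direction, fs, c=343.0):
    """Minimal delay-and-sum beamformer (illustrative sketch).

    mic_signals:    (n_mics, n_samples) time-domain audio, one row per mic
    mic_positions:  (n_mics, 3) mic coordinates in meters
    look_direction: unit vector pointing from the array toward the talker
    fs:             sample rate in Hz
    c:              speed of sound in m/s
    """
    n_mics, n_samples = mic_signals.shape
    # A plane wave from look_direction reaches mics nearer the talker first.
    # Arrival time of each mic relative to the array origin:
    arrival = -(mic_positions @ look_direction) / c
    arrival -= arrival.min()            # extra delay vs. the earliest mic, >= 0

    out = np.zeros(n_samples)
    for sig, tau in zip(mic_signals, arrival):
        shift = int(round(tau * fs))    # integer-sample approximation
        # Advance late arrivals so wavefronts from look_direction line up;
        # aligned speech adds coherently, off-axis noise averages out.
        out[: n_samples - shift] += sig[shift:]
    return out / n_mics
```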

From hearing aids that solve the cocktail party problem to autonomous vehicles that “listen” using vibration sensors mounted on glass and panels, we unpack practical use cases that redefine human-machine interfaces. Kurt explains how LLMs are accelerating voice-first experiences, why clean input signals matter more than ever, and how Syntiant’s hardware-agnostic software and always-on wake engines speed up partner deployments. We also talk growth strategy, the economics of adding more AI content to the bill of materials, and the importance of playing well within a broad ecosystem.
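
To make "always-on wake engine" concrete: the pattern is a tiny, milliwatt-scale detector that continuously scores a rolling audio buffer and only powers up the larger system on a trigger. A hypothetical sketch follows, where tiny_wake_model and start_full_assistant are illustrative stand-ins, not Syntiant APIs.

```python
import numpy as np

FRAME = 160      # 10 ms of audio at 16 kHz
WINDOW = 16000   # 1 s rolling window the detector scores

def always_on_loop(mic_stream, tiny_wake_model, start_full_assistant,
                   threshold=0.9):
    """Run a small detector continuously; wake the big system on a hit.

    mic_stream:           iterable yielding FRAME samples at a time
    tiny_wake_model:      callable scoring a WINDOW-sample buffer in [0, 1]
    start_full_assistant: callable invoked once the wake word is detected
    """
    buffer = np.zeros(WINDOW, dtype=np.float32)
    for frame in mic_stream:
        buffer = np.roll(buffer, -FRAME)   # slide the window forward
        buffer[-FRAME:] = frame            # append the newest 10 ms
        if tiny_wake_model(buffer) > threshold:
            start_full_assistant(buffer)   # heavy model only runs from here
```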

If you care about edge AI, sensor innovation, low-power inference, and voice interfaces that actually work in the real world, this conversation maps the path forward. Subscribe, share with a friend who loves practical AI, and leave a review with the next device you want to see go truly on-device.
