tinyML EMEA – Christoph Posch: Event sensors for embedded edge AI vision applications



Event sensors for embedded edge AI vision applications
Christoph POSCH, CTO, PROPHESEE

Event-based vision is an emerging paradigm for acquiring and processing visual information in a wide range of artificial vision applications, including industrial, surveillance, IoT, AR/VR, and automotive. The highly efficient acquisition of sparse data and the robustness to uncontrolled lighting conditions are characteristics of the event sensing process that make event-based vision attractive for at-the-edge visual perception systems, which must cope with limited resources while operating with a high degree of autonomy.
However, the unconventional format of the event data, non-constant data rates, non-standard interfaces, and, more generally, the way dynamic visual information is encoded in the data pose challenges to the use and integration of event sensors in embedded vision systems.
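To make the unconventional encoding concrete, the sketch below models how an event sensor produces data: each pixel independently emits an event only when its log-intensity changes beyond a contrast threshold, so static scene regions generate no data and the output rate depends entirely on scene activity. Field names and the tuple representation are purely illustrative; real sensors such as Prophesee's emit packed binary event formats (e.g. EVT), not Python objects.

```python
import math
from typing import Iterator, NamedTuple

class Event(NamedTuple):
    x: int         # pixel column
    y: int         # pixel row
    t: int         # timestamp, microseconds
    polarity: int  # +1 = brightness increase, -1 = decrease

def encode_events(prev_frame, curr_frame, t_us: int,
                  threshold: float = 0.15) -> Iterator[Event]:
    """Emit an event only for pixels whose log-intensity changed by more
    than `threshold`. Unchanged pixels produce no data at all, which is
    why the event data rate is sparse and non-constant."""
    for y, (row_prev, row_curr) in enumerate(zip(prev_frame, curr_frame)):
        for x, (p, c) in enumerate(zip(row_prev, row_curr)):
            delta = math.log(c + 1e-6) - math.log(p + 1e-6)
            if abs(delta) > threshold:
                yield Event(x, y, t_us, 1 if delta > 0 else -1)
```

A fully static scene yields zero events from this loop, illustrating why downstream systems cannot assume a fixed frame rate or a fixed payload size.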
Prophesee has recently developed the first of a new generation of event sensors designed with the explicit goal of improving the integrability and usability of event sensing technology in embedded at-the-edge vision systems. Particular emphasis has been placed on event data pre-processing and formatting, data interface compatibility, and low-latency connectivity to various processing platforms, including low-power microcontrollers (uCs) and neuromorphic processor architectures. Furthermore, the sensor has been optimized for ultra-low-power operation, featuring a hierarchy of low-power modes and application-specific modes of operation. On-chip power management and an embedded uC core further improve the sensor's flexibility and usability at the edge.
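One widely used pre-processing step of the kind referred to above is accumulating the sparse event stream into a fixed-size, two-channel histogram (one channel per polarity) over a time window, so that frame-based interfaces and standard vision pipelines can consume the data. The sketch below is a generic host-side illustration of this technique, not a description of the sensor's actual on-chip formatting pipeline.

```python
def events_to_histogram(events, width: int, height: int,
                        t_start: int, t_end: int):
    """Accumulate (x, y, t, polarity) events falling in [t_start, t_end)
    into a height x width grid with two counters per pixel:
    index 0 counts positive events, index 1 counts negative events."""
    hist = [[[0, 0] for _ in range(width)] for _ in range(height)]
    for x, y, t, polarity in events:
        if t_start <= t < t_end:
            channel = 0 if polarity > 0 else 1
            hist[y][x][channel] += 1
    return hist
```

The output has a constant size regardless of how many events arrived, which is what makes such a representation compatible with fixed-bandwidth interfaces and conventional frame-based processing.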
