
Ultra-Fast, Energy-Efficient Neuromorphic Edge Processing For Event-Based and Frame-Based Cameras: ColibriUAV and Eye-tracking
Michele MAGNO
Head of the Project-Based Learning Center
ETH Zurich, D-ITET

Interest in dynamic vision sensors (DVS) for a wide range of applications is rising, mainly because the microsecond-level reaction time of these bio-inspired event sensors increases robustness and reduces the latency of perception tasks compared to an RGB camera. This talk presents two embedded platforms: ColibriUAV, a UAV platform with interfaces for both frame-based and event-based cameras for efficient perception and near-sensor processing, and ColibriEYE. The proposed platforms are designed around Kraken, a novel low-power RISC-V System on Chip with two hardware accelerators targeting spiking neural networks and deep ternary neural networks. Kraken can efficiently process both event data from a DVS camera and frame data from an RGB camera. A key feature of Kraken is its integrated, dedicated interface to a DVS camera from Inivation. This talk benchmarks the end-to-end latency and power efficiency of the neuromorphic, event-based UAV subsystem, demonstrating state-of-the-art event readout with a throughput of 7,200 event frames per second at a power consumption of 10.7 mW, over 6.6 times faster and a hundred times less power-hungry than the widely used approach of reading event data over the USB interface. The overall sensing and processing power consumption is below 50 mW, with latency in the millisecond range, making the platform suitable for low-latency autonomous nano-drones as well as eye tracking. A comparison with other commercial and academic processors will potentially be presented during the talk.
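As a rough illustration of the "event frame" notion behind the quoted throughput figure, the sketch below accumulates a stream of DVS events into fixed-size frames. This is a minimal example and not the talk's or Kraken's actual software stack; the event tuple format, sensor resolution, and count-based accumulation window are assumptions made purely for illustration.

```python
# Minimal sketch (assumed, not from ColibriUAV/Kraken): accumulate a DVS event
# stream into 2-channel "event frames" that a downstream network could consume.
import numpy as np

SENSOR_W, SENSOR_H = 346, 260          # assumed DVS resolution
EVENTS_PER_FRAME = 1000                # assumed count-based accumulation window

def events_to_frames(events):
    """Group a chronological event stream into event frames.

    events: iterable of (timestamp_us, x, y, polarity) with polarity in {0, 1}.
    Yields arrays of shape (2, SENSOR_H, SENSOR_W) counting ON/OFF events per pixel.
    """
    frame = np.zeros((2, SENSOR_H, SENSOR_W), dtype=np.uint16)
    count = 0
    for _, x, y, pol in events:
        frame[pol, y, x] += 1          # histogram the event at its pixel, per polarity
        count += 1
        if count == EVENTS_PER_FRAME:
            yield frame
            frame = np.zeros_like(frame)
            count = 0
    if count:                          # flush the last partial frame
        yield frame

# Usage example with synthetic events standing in for a real DVS stream.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5000
    ts = np.sort(rng.integers(0, 1_000_000, n))
    xs = rng.integers(0, SENSOR_W, n)
    ys = rng.integers(0, SENSOR_H, n)
    ps = rng.integers(0, 2, n)
    for i, f in enumerate(events_to_frames(zip(ts, xs, ys, ps))):
        print(f"frame {i}: {int(f.sum())} events")
```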

