tinyML Research Symposium: MetaLDC: Meta Learning of Low-Dimensional Computing Classifiers for Fast On-Device Adaption



https://www.tinyml.org/event/research-symposium-2023/#schedule
MetaLDC: Meta Learning of Low-Dimensional Computing Classifiers for Fast On-Device Adaption
Yejia LIU, PhD Student, University of California Riverside
Fast model updates for unseen tasks on intelligent edge devices are crucial but also challenging due to limited computational power. In this paper, we propose MetaLDC, which meta-trains brain-inspired, ultra-efficient low-dimensional computing (LDC) classifiers to enable fast adaptation on tiny devices with minimal computational cost. Concretely, during the meta-training stage, MetaLDC meta-trains a representation offline by explicitly taking into account that the final (binary) class layer will be fine-tuned for fast adaptation to unseen tasks on tiny devices; during the meta-testing stage, MetaLDC uses closed-form gradients of the loss function to enable fast adaptation of the class layer. Unlike traditional neural networks, MetaLDC is built on the emerging LDC framework to enable ultra-efficient inference. Our experiments demonstrate that, compared to SOTA baselines, MetaLDC achieves better accuracy, robustness against random bit errors, and hardware cost efficiency.
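The sketch below is not the authors' code; it only illustrates the workflow described in the abstract: a shared low-dimensional encoder is meta-trained offline and kept frozen, while only the final class layer is adapted on-device with a few closed-form gradient steps. The encoder form (linear projection plus sign), the cross-entropy loss, and all dimensions and hyperparameters are assumptions for illustration.

# Illustrative sketch of the MetaLDC adaptation idea (assumed details, not the paper's implementation)
import numpy as np

rng = np.random.default_rng(0)
D_IN, D_LDC, N_CLASSES = 64, 32, 4   # assumed dimensions

def encode(x, W_enc):
    # Assumed LDC-style encoder: linear projection followed by a sign
    # nonlinearity, giving low-dimensional bipolar features.
    return np.sign(x @ W_enc)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def class_layer_grad(h, y_onehot, W_cls):
    # Closed-form gradient of cross-entropy w.r.t. the class layer only,
    # mirroring the abstract's fast on-device adaptation of the class layer.
    p = softmax(h @ W_cls)
    return h.T @ (p - y_onehot) / len(h)

def adapt_class_layer(x, y_onehot, W_enc, W_cls, lr=0.1, steps=5):
    # On-device (meta-testing) adaptation: the encoder stays frozen,
    # only W_cls is updated for the unseen task.
    h = encode(x, W_enc)
    for _ in range(steps):
        W_cls = W_cls - lr * class_layer_grad(h, y_onehot, W_cls)
    return W_cls

# Example on a toy unseen task: W_enc stands in for the meta-trained encoder.
W_enc = rng.standard_normal((D_IN, D_LDC))
W_cls = rng.standard_normal((D_LDC, N_CLASSES)) * 0.01
x_support = rng.standard_normal((16, D_IN))
y_support = np.eye(N_CLASSES)[rng.integers(0, N_CLASSES, 16)]
W_cls = adapt_class_layer(x_support, y_support, W_enc, W_cls)

Because only the small class layer changes on-device, each adaptation step is a single low-dimensional matrix update, which is what makes this style of fine-tuning attractive for tiny devices.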

