tinyML Talks: Train-by-weight (TBW): Accelerated Deep Learning by Data Dimensionality Reduction



tinyML Talks recorded April 27, 2021
Train-by-weight (TBW): Accelerated Deep Learning by Data Dimensionality Reduction

Xingheng Lin and Michael Jo
Electrical and Computer Engineering
Rose-Hulman Institute of Technology

State-of-the-art pretrained machine/deep learning (M/DL) models are available to the tinyML community for numerous applications. However, training these models on new objects, or retraining the pretrained models, is computationally expensive.
Our proposed Train-by-Weight (TBW) approach combines a linear technique, such as principal component analysis (PCA), with a nonlinear classifier, such as a deep learning model. The approach makes two key contributions. First, we perform dimensionality reduction by generating weighted data sets with the linear stage. Second, the weighted data sets supply only the essential data to the M/DL model. As a result, we reduced training and verification time by up to 88% in a deep artificial neural network model, with approximately 1% accuracy loss.
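A minimal sketch of the PCA front end described above (not the authors' code; the data and dimensions are illustrative, assuming flattened image vectors are reduced before being fed to the M/DL model):

```python
import numpy as np

def pca_reduce(X, n_components):
    # Center the data and compute principal components via SVD.
    X_centered = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    # Project onto the top components -- this projection is the
    # lower-dimensional "weighted data set" passed to the M/DL model.
    return X_centered @ Vt[:n_components].T

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 784))   # e.g. 1000 flattened 28x28 images
X_reduced = pca_reduce(X, 64)      # 784 -> 64 features per sample
print(X_reduced.shape)             # (1000, 64)
```

Training a network on `X_reduced` instead of `X` shrinks the input layer (here from 784 to 64 units), which is where the reduced training time and bandwidth come from.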
The tinyML community may benefit from the proposed approach through faster training of M/DL models, thanks to the lower bandwidth of the reduced data. Moreover, its relatively simple architecture may enable energy-efficient hardware/software solutions.

