This paper provides additional background on a new progressive learning approach for deep neural networks called Deep Rapid Class Augmentation (Deep RCA). The goal of progressive learning is to reuse knowledge acquired during previous training to reduce the time required to add new classes to an existing model. A key benefit of Deep RCA is that it requires only training data from the new class to optimally update all classes in the existing model, which allows it to augment large models with new classes much faster than conventional training techniques. Specifically, this paper shows a 1700x reduction in training time when Deep RCA is used to augment a new class onto a 19-class base model, compared with the time required to train a similar model from scratch without any form of progressive or transfer learning. In addition, Deep RCA shows a 35x reduction in training time over current progressive learning techniques that employ a stochastic gradient descent style of optimization. These results show that Deep RCA provides a faster class augmentation method than current training techniques, which is expected to benefit applications that require real-time, continuous learning of new classes for perception tasks.
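To make the class-augmentation idea concrete, the sketch below shows one way a model could be updated for all classes using only new-class data: a frozen feature extractor feeds a regularized least-squares output layer whose inverse covariance is maintained with rank-one (Sherman-Morrison) updates. This is a minimal illustrative assumption, not the authors' exact Deep RCA algorithm; the class and method names (ClassAugmentingClassifier, partial_fit, add_class) are hypothetical.

```python
import numpy as np

class ClassAugmentingClassifier:
    """Illustrative linear classifier on top of a frozen feature extractor.

    Maintains the ridge solution W = P @ Q, where P = (X^T X + reg*I)^(-1)
    and Q = X^T Y are kept current with recursive rank-one updates, so a
    new class can be added using only features from that class while the
    weights of all classes are re-solved.
    """

    def __init__(self, feature_dim, num_classes, reg=1e-3):
        self.P = np.eye(feature_dim) / reg               # inverse regularized covariance
        self.Q = np.zeros((feature_dim, num_classes))    # accumulated X^T Y
        self.W = np.zeros((feature_dim, num_classes))    # classifier weights

    def partial_fit(self, features, labels):
        """Update with a batch of features (N x D) and one-hot labels (N x C)."""
        for x, y in zip(features, labels):
            x = x[:, None]                               # D x 1 column vector
            Px = self.P @ x
            denom = 1.0 + (x.T @ Px).item()
            self.P -= (Px @ Px.T) / denom                # Sherman-Morrison rank-1 update
            self.Q += x @ y[None, :]
        self.W = self.P @ self.Q                         # re-solve weights for ALL classes

    def add_class(self, new_class_features):
        """Augment the model with one new class using only that class's data."""
        n, d = new_class_features.shape
        c = self.Q.shape[1]
        self.Q = np.hstack([self.Q, np.zeros((d, 1))])   # add an output column
        labels = np.zeros((n, c + 1))
        labels[:, -1] = 1.0                              # one-hot targets for the new class
        self.partial_fit(new_class_features, labels)

    def predict(self, features):
        return np.argmax(features @ self.W, axis=1)
```

Under these assumptions, augmentation with a new class reduces to a handful of rank-one updates rather than retraining with gradient descent, which is consistent with the abstract's claim that only new-class data is needed to optimally update all classes.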
Deep Rapid Class Augmentation: A Progressive Learning Approach for Deep Neural Networks
2022-03-05
Conference paper