Authors:
Nermeen Abou Baker
and
Uwe Handmann
Affiliation:
Computer Science Institute, Ruhr West University of Applied Sciences, Luetzowstr. 5, Bottrop, Germany
Keyword(s):
Transfer Learning, Pretrained Model Selection, Transferability Metrics, Waste Classification.
Abstract:
Waste streams are growing rapidly due to rising consumption rates, yet they exhibit repeating patterns that advances in computer vision now allow to be classified with high accuracy. Collecting and annotating large datasets, however, is time-consuming; transfer learning can mitigate this problem. Selecting the most appropriate pretrained model is critical to maximizing the benefits of transfer learning, and transferability metrics provide an efficient way to evaluate pretrained models without extensive retraining or brute-force search. This study evaluates six transferability metrics for model selection in waste classification: Negative Conditional Entropy (NCE), Log Expected Empirical Prediction (LEEP), Logarithm of Maximum Evidence (LogME), TransRate, Gaussian Bhattacharyya Coefficient (GBC), and ImageNet accuracy. We evaluate these metrics on five waste classification datasets using 11 pretrained ImageNet models, comparing their performance for fine-tuning and head-training approaches. Results show that LogME correlates best with transfer accuracy for larger datasets, while ImageNet accuracy and TransRate are more effective for smaller datasets. Our method achieves up to a 364x speed-up over brute-force selection, demonstrating significant efficiency in practical applications.