Authors:
Thrupthi Ann John¹; Isha Dua¹; Vineeth N. Balasubramanian² and C. V. Jawahar¹

Affiliations:
¹ Center for Visual Information Technology, International Institute of Information Technology, Hyderabad, India
² Department of Computer Science and Engineering, Indian Institute of Technology, Hyderabad, India
Keyword(s):
Face Tasks, Transfer Learning, Efficient Transfer Learning, Face Recognition, Expression Recognition, Age Prediction, Gender Prediction, Head Pose.
Abstract:
Transfer learning is a popular method for obtaining deep trained models for data-scarce face tasks such as head pose estimation and emotion recognition. However, current transfer learning methods are inefficient and time-consuming because they do not fully account for the relationships between related tasks. Moreover, the transferred model is large and computationally expensive. As an alternative, we propose ETL: a technique that efficiently transfers a pre-trained model to a new task by retaining only the cross-task aware filters, resulting in a sparse transferred model. We demonstrate the effectiveness of ETL by transferring VGGFace, a popular face recognition model, to four diverse face tasks. Our experiments show that we attain a size reduction of up to 97% and an inference time reduction of up to 94% while retaining 99.5% of the baseline transfer learning accuracy.
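The abstract does not specify how "cross-task aware" filters are scored, so the following is only a hypothetical sketch of the general idea: score each convolutional filter by some importance measure on the target task, keep the top-scoring fraction, and zero out the rest to obtain a sparse transferred layer. The function name `prune_filters`, the mean-absolute-weight score, and the `keep_ratio` parameter are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def prune_filters(weights, scores, keep_ratio=0.1):
    """Zero out all but the top-scoring fraction of filters.

    weights: array of shape (num_filters, kh, kw), one conv filter per row
    scores:  array of shape (num_filters,), per-filter importance
             (assumed here; the paper's actual criterion is not given)
    Returns the sparsified weights and a boolean keep-mask.
    """
    num_keep = max(1, int(round(keep_ratio * len(scores))))
    keep_idx = np.argsort(scores)[-num_keep:]      # indices of top filters
    mask = np.zeros(len(scores), dtype=bool)
    mask[keep_idx] = True
    pruned = weights.copy()
    pruned[~mask] = 0.0                            # discard unimportant filters
    return pruned, mask

# Toy example: 10 filters with 3x3 kernels, scored by mean |weight|.
rng = np.random.default_rng(0)
w = rng.normal(size=(10, 3, 3))
s = np.abs(w).mean(axis=(1, 2))
pw, mask = prune_filters(w, s, keep_ratio=0.3)
```

In practice such a mask would be applied per layer before fine-tuning on the target task; the sparsity is what yields the model-size and inference-time reductions the abstract reports.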