All Together Now! The Benefits of Adaptively Fusing Pre-trained Deep Representations

Yehezkel S. Resheff, Itay Lieder, Tom Hope

2019

Abstract

Pre-trained deep neural networks, powerful models trained on large datasets, have become a popular tool for transfer learning in computer vision. However, the standard approach of using a single network potentially misses valuable information contained in other readily available models. In this work, we study the Mixture of Experts (MoE) approach for adaptively fusing multiple pre-trained models for each individual input image. In particular, we explore how far we can get by combining diverse pre-trained representations in a customized way that maximizes their potential within a lightweight framework. Our approach is motivated by an empirical study of the predictions made by popular pre-trained nets, which finds that both performance and agreement between models vary across datasets. We further propose a miniature CNN gating mechanism that operates on a thumbnail version of the input image, and show that this is sufficient to guide a good fusion. Finally, we explore a multi-modal blend of visual and natural-language representations, using a label-space embedding to inject pre-trained word vectors. Across multiple datasets, we demonstrate that an adaptive fusion of pre-trained models achieves favorable results.
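The adaptive fusion the abstract describes can be pictured as a standard mixture-of-experts layer whose gate is a tiny CNN fed a thumbnail of the input image. The following PyTorch sketch is purely illustrative and is not the authors' implementation; the module names, projection dimension, gate architecture, and thumbnail size are all assumptions made for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveFusion(nn.Module):
    """Illustrative mixture-of-experts fusion of pre-trained representations.

    expert_dims gives the feature size of each frozen pre-trained net;
    a miniature gating CNN sees only a small thumbnail of the input and
    produces per-image mixture weights over the experts.
    """
    def __init__(self, expert_dims, fused_dim=512, num_classes=100):
        super().__init__()
        # Project each expert's features into a shared space so they can be mixed.
        self.projections = nn.ModuleList(
            [nn.Linear(d, fused_dim) for d in expert_dims]
        )
        # Miniature gating CNN operating on a 32x32 thumbnail (size is an assumption).
        self.gate = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, stride=2, padding=1),   # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=3, stride=2, padding=1),  # 16 -> 8
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(16, len(expert_dims)),
        )
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, expert_features, thumbnail):
        # expert_features: list of (batch, d_i) tensors from the frozen pre-trained nets.
        weights = F.softmax(self.gate(thumbnail), dim=1)    # (batch, n_experts)
        projected = torch.stack(
            [proj(f) for proj, f in zip(self.projections, expert_features)],
            dim=1,
        )                                                   # (batch, n_experts, fused_dim)
        # Per-image weighted sum of the projected expert representations.
        fused = (weights.unsqueeze(-1) * projected).sum(dim=1)
        return self.classifier(fused)

For instance, fusing two frozen experts with 2048-d and 1536-d features (stand-ins for typical ResNet/Inception outputs) for a batch of four images would look like:

model = AdaptiveFusion(expert_dims=[2048, 1536], num_classes=100)
feats = [torch.randn(4, 2048), torch.randn(4, 1536)]  # stand-ins for frozen expert features
thumbs = torch.randn(4, 3, 32, 32)                    # thumbnails of the input images
logits = model(feats, thumbs)                         # shape: (4, 100)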

Paper Citation


in Harvard Style

Resheff, Y.S., Lieder, I. and Hope, T. (2019). All Together Now! The Benefits of Adaptively Fusing Pre-trained Deep Representations. In Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM, ISBN 978-989-758-351-3, pages 135-144. DOI: 10.5220/0007367301350144


in BibTeX Style

@conference{icpram19,
author={Yehezkel S. Resheff and Itay Lieder and Tom Hope},
title={All Together Now! The Benefits of Adaptively Fusing Pre-trained Deep Representations},
booktitle={Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2019},
pages={135--144},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0007367301350144},
isbn={978-989-758-351-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - All Together Now! The Benefits of Adaptively Fusing Pre-trained Deep Representations
SN - 978-989-758-351-3
AU - Resheff Y.
AU - Lieder I.
AU - Hope T.
PY - 2019
SP - 135
EP - 144
DO - 10.5220/0007367301350144