
All Together Now! The Benefits of Adaptively Fusing Pre-trained Deep Representations

Authors: Yehezkel Resheff 1; Itay Lieder 2; Tom Hope 2

Affiliations: 1 Intuit Tech Futures, Israel; 2 Intel Advanced Analytics, Israel

ISBN: 978-989-758-351-3

Keyword(s): Deep Learning, Fusion.

Related Ontology Subjects/Areas/Topics: Feature Selection and Extraction ; Pattern Recognition ; Theory and Methods

Abstract: Pre-trained deep neural networks, powerful models trained on large datasets, have become a popular tool in computer vision for transfer learning. However, the standard approach of using a single network potentially misses out on valuable information contained in other readily available models. In this work, we study the Mixture of Experts (MoE) approach for adaptively fusing multiple pre-trained models for each individual input image. In particular, we explore how far we can get by combining diverse pre-trained representations in a customized way that maximizes their potential in a lightweight framework. Our approach is motivated by an empirical study of the predictions made by popular pre-trained nets across various datasets, finding that both performance and agreement between models vary across datasets. We further propose a miniature CNN gating mechanism operating on a thumbnail version of the input image, and show this is enough to guide a good fusion. Finally, we explore a multi-modal blend of visual and natural-language representations, using a label-space embedding to inject pre-trained word-vectors. Across multiple datasets, we demonstrate that an adaptive fusion of pre-trained models can obtain favorable results.
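The core idea in the abstract — a gate computed from a thumbnail of the input that produces mixture weights over several pre-trained "expert" representations — can be sketched as follows. This is a minimal illustration with NumPy, not the paper's implementation: the function name, the linear gate (standing in for the paper's miniature CNN), and all parameter shapes are assumptions for the sake of the example.

```python
import numpy as np


def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)


def adaptive_fusion(expert_features, thumbnail, gate_w, gate_b):
    """Fuse K pre-trained representations per input image.

    expert_features: (K, D) array, one row per pre-trained model's
                     feature vector for this image.
    thumbnail:       (T,) flattened low-resolution version of the image,
                     fed to the gate (a linear layer here; the paper
                     uses a miniature CNN).
    gate_w, gate_b:  (T, K) and (K,) gate parameters (hypothetical).

    Returns the fused (D,) representation: a convex combination of the
    expert features, weighted per-image by the gate.
    """
    gate = softmax(thumbnail @ gate_w + gate_b)  # (K,) mixture weights
    return gate @ expert_features                # weighted sum of experts


# Toy usage: 3 experts, 8-dim features, 16-pixel thumbnail.
rng = np.random.default_rng(0)
K, D, T = 3, 8, 16
features = rng.normal(size=(K, D))
thumb = rng.normal(size=T)
w, b = rng.normal(size=(T, K)), np.zeros(K)
fused = adaptive_fusion(features, thumb, w, b)
```

Because the gate output is a softmax, the fused vector always lies in the convex hull of the expert features; different images produce different gates, which is what makes the fusion adaptive per input.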

License: CC BY-NC-ND 4.0


Paper citation in several formats:
Resheff, Y.; Lieder, I. and Hope, T. (2019). All Together Now! The Benefits of Adaptively Fusing Pre-trained Deep Representations. In Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM, ISBN 978-989-758-351-3, pages 135-144. DOI: 10.5220/0007367301350144

@conference{icpram19,
author={Yehezkel S. Resheff and Itay Lieder and Tom Hope},
title={All Together Now! The Benefits of Adaptively Fusing Pre-trained Deep Representations},
booktitle={Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2019},
pages={135-144},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0007367301350144},
isbn={978-989-758-351-3},
}

TY - CONF

JO - Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - All Together Now! The Benefits of Adaptively Fusing Pre-trained Deep Representations
SN - 978-989-758-351-3
AU - Resheff, Y.
AU - Lieder, I.
AU - Hope, T.
PY - 2019
SP - 135
EP - 144
DO - 10.5220/0007367301350144
