Hyperparameter Optimization in NLP Architectures
Noureddine Ettaik, Ben Lahmar El Habib
2021
Abstract
Hyperparameter optimization (HPO) is an essential part of building efficient machine learning models for natural language processing (NLP) tasks, especially in light of recent NLP breakthroughs. In this paper, we explore the problem of HPO through a survey of a selected set of academic publications in NLP, studying the strategies they used to optimize their hyperparameters and investigating their common traits. We then lay out recommendations for good practice in NLP HPO.
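Common automated HPO strategies in the literature include grid search, random search, and Bayesian optimization. As a concrete illustration of the simplest automated approach, the Python sketch below runs a random search over a small hyperparameter space; the search space, the toy scoring function, and all names in it are our own illustrative assumptions, not taken from the paper or the architectures it surveys.

```python
"""Minimal random-search HPO sketch. The search space and the toy
objective below are illustrative assumptions, not from the paper."""
import math
import random

# Hypothetical search space for a small NLP model.
SEARCH_SPACE = {
    "learning_rate": (1e-5, 1e-2),    # continuous, sampled log-uniformly
    "dropout": (0.0, 0.5),            # continuous, sampled uniformly
    "batch_size": [16, 32, 64, 128],  # discrete choices
}

def sample_config():
    """Draw one hyperparameter configuration from SEARCH_SPACE."""
    lo, hi = SEARCH_SPACE["learning_rate"]
    return {
        "learning_rate": 10 ** random.uniform(math.log10(lo), math.log10(hi)),
        "dropout": random.uniform(*SEARCH_SPACE["dropout"]),
        "batch_size": random.choice(SEARCH_SPACE["batch_size"]),
    }

def evaluate(config):
    """Toy stand-in for 'train the model, return a validation score'.
    A real study would train and validate an NLP model under `config`."""
    return (1.0
            - abs(math.log10(config["learning_rate"]) + 3.0) * 0.1  # best near 1e-3
            - abs(config["dropout"] - 0.2)                          # best near 0.2
            - abs(config["batch_size"] - 32) / 256)                 # best near 32

def random_search(budget=50, seed=0):
    """Evaluate `budget` random configurations and keep the best one."""
    random.seed(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(budget):
        cfg = sample_config()
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

if __name__ == "__main__":
    cfg, score = random_search()
    print(f"best config: {cfg}, score: {score:.3f}")
```

Random search is often used as a baseline because, unlike grid search, its sampling budget is independent of the number of hyperparameters, so it spends more trials on the dimensions that actually matter.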
Paper Citation
in Harvard Style
Ettaik, N. and El Habib, B. (2021). Hyperparameter Optimization in NLP Architectures. In Proceedings of the 2nd International Conference on Big Data, Modelling and Machine Learning - Volume 1: BML, ISBN 978-989-758-559-3, pages 466-470. DOI: 10.5220/0010736600003101
in Bibtex Style
@conference{bml21,
author={Noureddine Ettaik and Ben Lahmar El Habib},
title={Hyperparameter Optimization in NLP Architectures},
booktitle={Proceedings of the 2nd International Conference on Big Data, Modelling and Machine Learning - Volume 1: BML},
year={2021},
pages={466-470},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010736600003101},
isbn={978-989-758-559-3},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 2nd International Conference on Big Data, Modelling and Machine Learning - Volume 1: BML
TI - Hyperparameter Optimization in NLP Architectures
SN - 978-989-758-559-3
AU - Ettaik N.
AU - El Habib B.
PY - 2021
SP - 466
EP - 470
DO - 10.5220/0010736600003101