Transformers for Low-resource Neural Machine Translation
Andargachew Gezmu, Andreas Nürnberger
2022
Abstract
Recent advances have made neural machine translation the state of the art. However, while neural machine translation has improved considerably for a few high-resource languages, its performance remains low for less-resourced languages, since the amount of training data strongly affects the quality of machine translation models. It is therefore essential for less-resourced languages to identify a neural machine translation architecture that trains the best models under low-data conditions. In this research, we modified Transformer-based neural machine translation architectures for low-resource polysynthetic languages. In the automatic evaluation on public benchmark datasets, our proposed system outperformed a strong baseline.
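The abstract does not specify which modifications were made to the Transformer. As a purely illustrative sketch (not the authors' configuration), the following PyTorch snippet shows one common way to adapt a Transformer to low-data conditions: shrinking its capacity relative to the base Transformer (fewer layers, heads, and hidden units) and raising dropout to counter overfitting. All names and hyperparameters here (SmallNMT, VOCAB_SIZE, the reduced dimensions) are assumptions for illustration only.

import torch
import torch.nn as nn

# Assumed hyperparameters for a reduced-capacity model; the base
# Transformer uses d_model=512, 8 heads, 6 layers, FFN 2048.
VOCAB_SIZE = 8000   # assumed shared subword vocabulary size
D_MODEL = 256
N_HEADS = 4
N_LAYERS = 3
FFN_DIM = 1024
DROPOUT = 0.3       # higher dropout for small training sets

class SmallNMT(nn.Module):
    def __init__(self):
        super().__init__()
        # Positional encodings are omitted here for brevity.
        self.src_embed = nn.Embedding(VOCAB_SIZE, D_MODEL)
        self.tgt_embed = nn.Embedding(VOCAB_SIZE, D_MODEL)
        self.transformer = nn.Transformer(
            d_model=D_MODEL, nhead=N_HEADS,
            num_encoder_layers=N_LAYERS, num_decoder_layers=N_LAYERS,
            dim_feedforward=FFN_DIM, dropout=DROPOUT,
            batch_first=True,
        )
        self.out = nn.Linear(D_MODEL, VOCAB_SIZE)

    def forward(self, src, tgt):
        # Causal mask: decoder positions attend only to earlier tokens.
        mask = self.transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.src_embed(src), self.tgt_embed(tgt),
                             tgt_mask=mask)
        return self.out(h)

model = SmallNMT()
src = torch.randint(0, VOCAB_SIZE, (2, 16))  # toy batch: 2 sentences, 16 tokens
tgt = torch.randint(0, VOCAB_SIZE, (2, 16))
logits = model(src, tgt)
print(logits.shape)  # torch.Size([2, 16, 8000])

The intuition behind such downscaling is that a model with fewer parameters is less prone to overfit the small parallel corpora typical of low-resource language pairs; the specific values that work best are found empirically per dataset.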
Paper Citation
in Harvard Style
Gezmu A. and Nürnberger A. (2022). Transformers for Low-resource Neural Machine Translation. In Proceedings of the 14th International Conference on Agents and Artificial Intelligence - Volume 1: NLPinAI, ISBN 978-989-758-547-0, pages 459-466. DOI: 10.5220/0010971500003116
in Bibtex Style
@conference{nlpinai22,
author={Andargachew Gezmu and Andreas Nürnberger},
title={Transformers for Low-resource Neural Machine Translation},
booktitle={Proceedings of the 14th International Conference on Agents and Artificial Intelligence - Volume 1: NLPinAI},
year={2022},
pages={459--466},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010971500003116},
isbn={978-989-758-547-0},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 14th International Conference on Agents and Artificial Intelligence - Volume 1: NLPinAI
TI - Transformers for Low-resource Neural Machine Translation
SN - 978-989-758-547-0
AU - Gezmu A.
AU - Nürnberger A.
PY - 2022
SP - 459
EP - 466
DO - 10.5220/0010971500003116