
Paper: Learning Embeddings from Free-text Triage Notes using Pretrained Transformer Models

Authors: Émilien Arnaud 1; Mahmoud Elbattah 2,3; Maxime Gignon 1 and Gilles Dequen 3

Affiliations: 1 Emergency Department, Amiens-Picardy University Hospital, Amiens, France; 2 Faculty of Environment and Technology, University of the West of England, Bristol, U.K.; 3 Laboratoire MIS, Université de Picardie Jules Verne, Amiens, France

Keyword(s): Natural Language Processing, BERT, Transformers, Clustering, Healthcare Analytics.

Abstract: The advent of transformer models has allowed for tremendous progress in the Natural Language Processing (NLP) domain. Pretrained transformers have successfully delivered state-of-the-art performance on a myriad of NLP tasks. This study presents an application of transformers to learn contextual embeddings from the free-text triage notes widely recorded at the emergency department. A large-scale retrospective cohort of more than 260K triage notes was provided by the University Hospital of Amiens-Picardy in France. We utilize a set of Bidirectional Encoder Representations from Transformers (BERT) models pretrained for the French language. The quality of the embeddings is empirically examined using a set of clustering models. In this regard, we provide a comparative analysis of popular models including CamemBERT, FlauBERT, and mBART. The study can be broadly regarded as an addition to the ongoing contributions applying the BERT approach in the healthcare context.
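To illustrate the kind of pipeline the abstract describes, below is a minimal sketch of extracting embeddings from French triage notes with a pretrained French BERT-style model and then clustering them. The checkpoint name (camembert-base), mean pooling over token states, the example notes, and the choice of k-means with two clusters are illustrative assumptions, not the authors' exact setup.

# Minimal sketch (not the authors' code): embed French triage notes with a
# pretrained CamemBERT model, then cluster the embeddings with k-means.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.cluster import KMeans

tokenizer = AutoTokenizer.from_pretrained("camembert-base")  # assumed checkpoint
model = AutoModel.from_pretrained("camembert-base")
model.eval()

# Hypothetical example notes; the actual cohort is not publicly available.
notes = [
    "Douleur thoracique depuis ce matin, pas de dyspnee.",
    "Chute de sa hauteur, plaie au cuir chevelu.",
]

embeddings = []
with torch.no_grad():
    for note in notes:
        inputs = tokenizer(note, return_tensors="pt", truncation=True, max_length=128)
        outputs = model(**inputs)
        # Mean-pool the last hidden states over tokens (one common pooling choice).
        emb = outputs.last_hidden_state.mean(dim=1).squeeze(0)
        embeddings.append(emb.numpy())

# Cluster the note embeddings; the number of clusters here is arbitrary.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(embeddings)
print(kmeans.labels_)

The same skeleton applies to FlauBERT or mBART by swapping the checkpoint name; the paper's comparison rests on how well such embeddings separate into coherent clusters.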

CC BY-NC-ND 4.0


Paper citation in several formats:
Arnaud, É.; Elbattah, M.; Gignon, M. and Dequen, G. (2022). Learning Embeddings from Free-text Triage Notes using Pretrained Transformer Models. In Proceedings of the 15th International Joint Conference on Biomedical Engineering Systems and Technologies - Scale-IT-up; ISBN 978-989-758-552-4; ISSN 2184-4305, SciTePress, pages 835-841. DOI: 10.5220/0011012800003123

@conference{scale-it-up22,
author={Émilien Arnaud and Mahmoud Elbattah and Maxime Gignon and Gilles Dequen},
title={Learning Embeddings from Free-text Triage Notes using Pretrained Transformer Models},
booktitle={Proceedings of the 15th International Joint Conference on Biomedical Engineering Systems and Technologies - Scale-IT-up},
year={2022},
pages={835-841},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011012800003123},
isbn={978-989-758-552-4},
issn={2184-4305},
}

TY - CONF

JO - Proceedings of the 15th International Joint Conference on Biomedical Engineering Systems and Technologies - Scale-IT-up
TI - Learning Embeddings from Free-text Triage Notes using Pretrained Transformer Models
SN - 978-989-758-552-4
IS - 2184-4305
AU - Arnaud, É.
AU - Elbattah, M.
AU - Gignon, M.
AU - Dequen, G.
PY - 2022
SP - 835
EP - 841
DO - 10.5220/0011012800003123
PB - SciTePress