
Authors: Juraj Vladika, Alexander Fichtl and Florian Matthes

Affiliation: Department of Computer Science, Technical University of Munich, Boltzmannstraße 3, 85748 Garching bei München, Germany

Keyword(s): Natural Language Processing (NLP), Pre-Trained Language Models, Knowledge Graphs, Domain Knowledge, Knowledge Enhancement, Adapters, Biomedicine, Biomedical NLP.

Abstract: Recent advances in natural language processing (NLP) owe their success to pre-training language models on large amounts of unstructured data. Still, there is an increasing effort to combine the unstructured nature of LMs with structured knowledge and reasoning. Particularly in the rapidly evolving field of biomedical NLP, knowledge-enhanced language models (KELMs) have emerged as promising tools to bridge the gap between large language models and domain-specific knowledge, considering the available biomedical knowledge graphs (KGs) curated by experts over the decades. In this paper, we develop an approach that uses lightweight adapter modules to inject structured biomedical knowledge into pre-trained language models (PLMs). We use two large KGs, the biomedical knowledge system UMLS and the novel biochemical ontology OntoChem, with two prominent biomedical PLMs, PubMedBERT and BioLinkBERT. The approach includes partitioning knowledge graphs into smaller subgraphs, fine-tuning adapter modules for each subgraph, and combining the knowledge in a fusion layer. We test the performance on three downstream tasks: document classification, question answering, and natural language inference. We show that our methodology leads to performance improvements in several instances while keeping requirements in computing power low. Finally, we provide a detailed interpretation of the results and report valuable insights for future work.
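The abstract outlines a three-step pipeline: partition the KG into subgraphs, train one adapter per subgraph, and combine them in a fusion layer. Below is a minimal, illustrative sketch of the first step only, using partitioning by relation type as one plausible criterion; the function name, the toy triples, and the size cap are hypothetical and do not come from the paper, whose actual partitioning scheme may differ.

```python
from collections import defaultdict

def partition_by_relation(triples, max_size):
    """Group (head, relation, tail) triples by relation type, then split
    each group into subgraphs holding at most max_size triples.

    Illustrative only: the paper's actual partitioning criterion for
    UMLS/OntoChem may be different (e.g., by semantic group)."""
    groups = defaultdict(list)
    for head, relation, tail in triples:
        groups[relation].append((head, relation, tail))
    subgraphs = []
    for relation, group in groups.items():
        # Chunk each relation-homogeneous group into fixed-size subgraphs.
        for i in range(0, len(group), max_size):
            subgraphs.append(group[i:i + max_size])
    return subgraphs

# Toy UMLS-style triples (hypothetical examples, not from the paper's data).
triples = [
    ("aspirin", "treats", "headache"),
    ("ibuprofen", "treats", "fever"),
    ("aspirin", "interacts_with", "warfarin"),
]
subgraphs = partition_by_relation(triples, max_size=2)
```

Each resulting subgraph would then serve as the training signal for one lightweight adapter, with AdapterFusion-style composition combining the per-subgraph adapters at inference time.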

CC BY-NC-ND 4.0


Paper citation in several formats:
Vladika, J.; Fichtl, A. and Matthes, F. (2024). Diversifying Knowledge Enhancement of Biomedical Language Models Using Adapter Modules and Knowledge Graphs. In Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART; ISBN 978-989-758-680-4; ISSN 2184-433X, SciTePress, pages 376-387. DOI: 10.5220/0012395200003636

@conference{icaart24,
author={Juraj Vladika and Alexander Fichtl and Florian Matthes},
title={Diversifying Knowledge Enhancement of Biomedical Language Models Using Adapter Modules and Knowledge Graphs},
booktitle={Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART},
year={2024},
pages={376-387},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0012395200003636},
isbn={978-989-758-680-4},
issn={2184-433X},
}

TY - CONF

JO - Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART
TI - Diversifying Knowledge Enhancement of Biomedical Language Models Using Adapter Modules and Knowledge Graphs
SN - 978-989-758-680-4
IS - 2184-433X
AU - Vladika, J.
AU - Fichtl, A.
AU - Matthes, F.
PY - 2024
SP - 376
EP - 387
DO - 10.5220/0012395200003636
PB - SciTePress