
Authors: Pervaiz Khan 1,2; Andreas Dengel 1,2 and Sheraz Ahmed 1

Affiliations: 1 German Research Center for Artificial Intelligence (DFKI), 67663 Kaiserslautern, Germany; 2 Department of Computer Science, TU Kaiserslautern, 67663 Kaiserslautern, Germany

Keyword(s): Random Noise, Knowledge Distillation, Text Classification.

Abstract: Effectively finetuning foundation models on downstream tasks is an ongoing area of research. In this paper, we present a finetuning method, “Randout-KD”, that enhances the performance of a student model for text classification. Specifically, we propose injecting noise into the representations of the transformer model during finetuning, which acts as a regularizer. Moreover, we integrate knowledge distillation with this noise-injection method and show that combining the two approaches boosts performance over the baseline model. We evaluate the proposed method on two datasets, “CODA-19” and “RHMD”, using PubMedBERT and RoBERTa Large as teacher models and data2vec as the student model. Results show that the proposed approach improves accuracy by up to 1.2% compared to the baseline methods.
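The abstract combines two standard ingredients: additive noise on intermediate representations as regularization, and a knowledge-distillation loss between teacher and student outputs. The following is a minimal, framework-free sketch of those two ingredients; the function names, noise scale, and temperature are illustrative assumptions, not values taken from the paper:

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def inject_noise(hidden, scale=0.01, rng=random):
    """Add small Gaussian noise to a hidden representation.

    Applied during finetuning only; acts as a simple regularizer.
    The scale here is an illustrative choice.
    """
    return [h + rng.gauss(0.0, scale) for h in hidden]

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between teacher and student soft distributions,
    scaled by T^2 as in standard knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # teacher (target)
    q = softmax(student_logits, temperature)  # student
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * temperature ** 2
```

In a real training loop, `kd_loss` would be mixed with the ordinary cross-entropy on the labels, and `inject_noise` would be applied to the student's transformer representations before the classification head.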

CC BY-NC-ND 4.0


Paper citation in several formats:
Khan, P.; Dengel, A. and Ahmed, S. (2023). Randout-KD: Finetuning Foundation Models for Text Classification via Random Noise and Knowledge Distillation. In Proceedings of the 15th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART; ISBN 978-989-758-623-1; ISSN 2184-433X, SciTePress, pages 457-465. DOI: 10.5220/0011687800003393

@conference{icaart23,
author={Pervaiz Khan and Andreas Dengel and Sheraz Ahmed},
title={Randout-KD: Finetuning Foundation Models for Text Classification via Random Noise and Knowledge Distillation},
booktitle={Proceedings of the 15th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART},
year={2023},
pages={457-465},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011687800003393},
isbn={978-989-758-623-1},
issn={2184-433X},
}

TY - CONF

JO - Proceedings of the 15th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART
TI - Randout-KD: Finetuning Foundation Models for Text Classification via Random Noise and Knowledge Distillation
SN - 978-989-758-623-1
IS - 2184-433X
AU - Khan, P.
AU - Dengel, A.
AU - Ahmed, S.
PY - 2023
SP - 457
EP - 465
DO - 10.5220/0011687800003393
PB - SciTePress