Class-wise Knowledge Distillation for Lightweight Segmentation Model
Ryota Ikedo, Kotaro Nagata, Kazuhiro Hotta
2023
Abstract
In recent years, the accuracy of semantic segmentation has been improved by deepening segmentation models, but this comes at the cost of a large increase in computational complexity and required resources. Knowledge distillation has therefore been studied as a model compression method. We propose a knowledge distillation method in which the output distribution of a teacher model trained for each class is used as the target of the student model, with the aim of memory compression and accuracy improvement. Experimental results on two different datasets demonstrate that segmentation accuracy was improved without increasing the computational cost.
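The paper itself defines the exact loss; purely as a hypothetical sketch of what a class-wise distillation target could look like in PyTorch (the function name, temperature T, and per-class averaging below are assumptions for illustration, not the authors' formulation):

```python
import torch
import torch.nn.functional as F

def classwise_distillation_loss(student_logits, teacher_logits, labels, num_classes, T=2.0):
    """Sketch: average a KL-divergence distillation term over the pixels of each
    class separately, so that every class contributes equally to the loss.
    student_logits, teacher_logits: (B, C, H, W); labels: (B, H, W)."""
    log_p_s = F.log_softmax(student_logits / T, dim=1)   # student distribution (log)
    p_t = F.softmax(teacher_logits / T, dim=1)           # teacher (soft target) distribution
    # per-pixel KL divergence between teacher and student, shape (B, H, W)
    kl = (p_t * (p_t.clamp_min(1e-8).log() - log_p_s)).sum(dim=1)
    per_class = []
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            per_class.append(kl[mask].mean())            # mean KL over pixels labelled class c
    return (T * T) * torch.stack(per_class).mean()
```

In practice such a term would be added to the usual cross-entropy loss of the student; the class-wise averaging is what distinguishes this sketch from standard pixel-wise distillation.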
Paper Citation
in Harvard Style
Ikedo R., Nagata K. and Hotta K. (2023). Class-wise Knowledge Distillation for Lightweight Segmentation Model. In Proceedings of the 16th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2023) - Volume 4: BIOSIGNALS; ISBN 978-989-758-631-6, SciTePress, pages 287-293. DOI: 10.5220/0011719900003414
in Bibtex Style
@conference{biosignals23,
author={Ryota Ikedo and Kotaro Nagata and Kazuhiro Hotta},
title={Class-wise Knowledge Distillation for Lightweight Segmentation Model},
booktitle={Proceedings of the 16th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2023) - Volume 4: BIOSIGNALS},
year={2023},
pages={287-293},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011719900003414},
isbn={978-989-758-631-6},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 16th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2023) - Volume 4: BIOSIGNALS
TI - Class-wise Knowledge Distillation for Lightweight Segmentation Model
SN - 978-989-758-631-6
AU - Ikedo R.
AU - Nagata K.
AU - Hotta K.
PY - 2023
SP - 287
EP - 293
DO - 10.5220/0011719900003414
PB - SciTePress