Class-wise Knowledge Distillation for Lightweight Segmentation Model

Authors: Ryota Ikedo, Kotaro Nagata and Kazuhiro Hotta

Affiliation: Meijo University, 1-501 Shiogamaguchi, Tempaku-ku, Nagoya 468-8502, Japan

Keyword(s): Knowledge Distillation, Class-wise, Semantic Segmentation.

Abstract: In recent years, the accuracy of semantic segmentation has been improved by deepening segmentation models, but this requires large amounts of computational resources due to the increased computational complexity. Knowledge distillation has therefore been studied as a model compression method. We propose a knowledge distillation method in which the output distribution of a teacher model trained for each class is used as the target of the student model, with the aim of memory compression and accuracy improvement. Experimental results demonstrate that segmentation accuracy was improved without increasing computational cost on two different datasets.
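
The abstract only outlines the idea. As a rough illustration of class-wise distillation in general, and not the authors' exact formulation, the sketch below shows how per-class teacher outputs could supervise a segmentation student through a temperature-softened KL divergence restricted to the pixels of each class. All names (classwise_distillation_loss, teacher_logits_per_class, temperature) are hypothetical placeholders.

import torch
import torch.nn.functional as F

def classwise_distillation_loss(student_logits, teacher_logits_per_class,
                                labels, temperature=2.0):
    # student_logits:           (B, C, H, W) raw logits of the student model
    # teacher_logits_per_class: list of C tensors, each (B, C, H, W),
    #                           one per class-specific teacher (assumption)
    # labels:                   (B, H, W) ground-truth class indices
    num_classes = student_logits.shape[1]
    loss = student_logits.new_zeros(())
    for c, teacher_logits in enumerate(teacher_logits_per_class):
        mask = (labels == c).float()              # pixels belonging to class c
        if mask.sum() == 0:
            continue
        # Softened distributions, as in standard knowledge distillation
        t = F.softmax(teacher_logits / temperature, dim=1)
        s = F.log_softmax(student_logits / temperature, dim=1)
        # Per-pixel KL divergence, averaged over class-c pixels only
        kl = F.kl_div(s, t, reduction="none").sum(dim=1)
        loss = loss + (kl * mask).sum() / mask.sum()
    return loss * (temperature ** 2) / num_classes

In practice such a term would typically be combined with the ordinary cross-entropy loss on the ground-truth labels when training the student.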

CC BY-NC-ND 4.0

Paper citation in several formats:
Ikedo, R.; Nagata, K. and Hotta, K. (2023). Class-wise Knowledge Distillation for Lightweight Segmentation Model. In Proceedings of the 16th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2023) - BIOSIGNALS; ISBN 978-989-758-631-6; ISSN 2184-4305, SciTePress, pages 287-293. DOI: 10.5220/0011719900003414

@conference{biosignals23,
author={Ryota Ikedo and Kotaro Nagata and Kazuhiro Hotta},
title={Class-wise Knowledge Distillation for Lightweight Segmentation Model},
booktitle={Proceedings of the 16th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2023) - BIOSIGNALS},
year={2023},
pages={287-293},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011719900003414},
isbn={978-989-758-631-6},
issn={2184-4305},
}

TY - CONF
JO - Proceedings of the 16th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2023) - BIOSIGNALS
TI - Class-wise Knowledge Distillation for Lightweight Segmentation Model
SN - 978-989-758-631-6
IS - 2184-4305
AU - Ikedo, R.
AU - Nagata, K.
AU - Hotta, K.
PY - 2023
SP - 287
EP - 293
DO - 10.5220/0011719900003414
PB - SciTePress
ER -