
Multi-Object Keypoint Detection and Pose Estimation for Pigs

Authors: Qinghua Guo 1; Dawei Pei 1; Yue Sun 1,2; Patrick P. J. H. Langenhuizen 1; Clémence A. E. M. Orsini 3; Kristine Hov Martinsen 4; Øyvind Nordbø 4; J. Bolhuis 3; Piter Bijma 3 and Peter H. N. de With 1

Affiliations: 1 Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands; 2 Faculty of Applied Science, Macao Polytechnic University, Macao Special Administrative Region of China; 3 Department of Animal Sciences, Wageningen University & Research, Wageningen, The Netherlands; 4 Norsvin SA, Hamar, Norway

Keyword(s): Animal Keypoint Detection, Animal Posture Recognition, Multi-Object Surveillance.

Abstract: Monitoring the daily status of pigs is crucial for enhancing their health and welfare. Pose estimation has emerged as an effective method for tracking pig postures, with keypoint detection and skeleton extraction playing pivotal roles in this process. Despite advancements in human pose estimation, research focused on pigs remains limited. To bridge this gap, this study applies the You Only Look Once model Version 8 (YOLOv8) for keypoint detection and skeleton extraction, evaluated on a manually annotated pig dataset. Additionally, pose-estimation performance is compared across different data modalities and models, including an image-based model (ResNet-18), a keypoint-based model (Multi-Layer Perceptron, MLP), and a combined image-and-keypoint-based model (YOLOv8-pose). The keypoint detection branch achieves an average Percentage of Detected Joints (PDJ) of 48.96%, an average Percentage of Correct Keypoints (PCK) of 84.85%, and an average Object Keypoint Similarity (OKS) of 89.43%. The best overall pose-estimation accuracy, 99.33%, is obtained by the YOLOv8-pose model, which indicates the superiority of the joint image-and-keypoint-based model for pose estimation. The comprehensive experiments and visualization results indicate that the proposed method effectively identifies specific pig body parts in most monitoring frames, facilitating an accurate assessment of pig activity and welfare.
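The PCK and OKS metrics named in the abstract can be sketched as follows. This is an illustrative implementation of the standard definitions (PCK: a keypoint counts as correct if it lies within a fraction of a reference size from ground truth; OKS: COCO-style Gaussian similarity weighted by object area and per-keypoint constants), not the paper's evaluation code; the threshold `alpha` and the per-keypoint constants `kappas` are assumed values for demonstration.

```python
import math

def pck(preds, gts, ref_size, alpha=0.5):
    """Percentage of Correct Keypoints: a prediction is correct when its
    Euclidean distance to ground truth is below alpha * ref_size
    (ref_size is typically a bounding-box or torso dimension)."""
    hits = sum(
        1 for (px, py), (gx, gy) in zip(preds, gts)
        if math.hypot(px - gx, py - gy) < alpha * ref_size
    )
    return hits / len(gts)

def oks(preds, gts, area, kappas):
    """COCO-style Object Keypoint Similarity, averaged over keypoints:
    exp(-d_i^2 / (2 * area * kappa_i^2)) per keypoint."""
    sims = [
        math.exp(-((px - gx) ** 2 + (py - gy) ** 2) / (2 * area * k ** 2))
        for (px, py), (gx, gy), k in zip(preds, gts, kappas)
    ]
    return sum(sims) / len(sims)
```

A perfect prediction yields 1.0 for both metrics; the paper's reported averages (PCK 84.85%, OKS 89.43%) would correspond to such scores averaged over all annotated pig keypoints.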

CC BY-NC-ND 4.0


Paper citation in several formats:
Guo, Q., Pei, D., Sun, Y., Langenhuizen, P. P. J. H., Orsini, C. A. E. M., Martinsen, K. H., Nordbø, Ø., Bolhuis, J., Bijma, P. and N. de With, P. H. (2025). Multi-Object Keypoint Detection and Pose Estimation for Pigs. In Proceedings of the 20th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP; ISBN 978-989-758-728-3; ISSN 2184-4321, SciTePress, pages 466-474. DOI: 10.5220/0013170100003912

@conference{visapp25,
author={Qinghua Guo and Dawei Pei and Yue Sun and Patrick P. J. H. Langenhuizen and Clémence A. E. M. Orsini and Kristine Hov Martinsen and Øyvind Nordbø and J. Bolhuis and Piter Bijma and Peter H. {N. de With}},
title={Multi-Object Keypoint Detection and Pose Estimation for Pigs},
booktitle={Proceedings of the 20th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP},
year={2025},
pages={466-474},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013170100003912},
isbn={978-989-758-728-3},
issn={2184-4321},
}

TY - CONF

JO - Proceedings of the 20th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP
TI - Multi-Object Keypoint Detection and Pose Estimation for Pigs
SN - 978-989-758-728-3
IS - 2184-4321
AU - Guo, Q.
AU - Pei, D.
AU - Sun, Y.
AU - Langenhuizen, P.
AU - Orsini, C.
AU - Martinsen, K.
AU - Nordbø, Ø.
AU - Bolhuis, J.
AU - Bijma, P.
AU - N. de With, P.
PY - 2025
SP - 466
EP - 474
DO - 10.5220/0013170100003912
PB - SciTePress