A Group Activity Based Method for Early Recognition of Surgical Processes Using the Camera Observing Surgeries in an Operating Room and Spatio-Temporal Graph Based Deep Learning Model

Keishi Nishikawa, Jun Ohya

2025

Abstract

Towards the realization of a scrub-nurse robot, this paper proposes a group activity-based method for the early recognition of surgical processes that uses only the early part of a video acquired by a camera observing surgeries. Our proposed method consists of two steps. In the first step, we construct spatio-temporal graphs that represent the group activity in the operating room; each graph node contains (a) the visual features of a participant and (b) the participant's position. In the second step, the generated graphs are input to our model, which classifies the input. Since each node contains both visual features and position information, our model treats the graph as a point cloud in spatio-temporal space; therefore, the Point Transformer Layer from (Zhao et al., 2021) is used as its building block. Experiments are conducted on a public dataset: the mock knee-replacement surgeries of (Özsoy et al., 2022). The results show that our method performs early recognition, achieving an accuracy of 68.2 % to 90.0 % on this dataset when observing only an early portion, 17.1 % to 34.1 % of the entire duration, from the beginning. Furthermore, a comparison with the state-of-the-art method for early recognition of group activity (Zhai et al., 2023) is also conducted. It turns out that ours significantly outperforms (Zhai et al., 2023).
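To illustrate the second step, the following is a minimal PyTorch-style sketch, not the authors' implementation: participant nodes carrying visual features and (x, y, t) positions are treated as a spatio-temporal point cloud and processed by a Point Transformer-style layer in the spirit of (Zhao et al., 2021). The class names, feature dimensions, k-nearest-neighbour grouping, and pooling classifier head are assumptions made for illustration only.

# Minimal sketch (assumptions, not the authors' code): participant nodes with
# visual features and (x, y, t) positions are treated as a spatio-temporal
# point cloud and processed by a Point Transformer-style layer (Zhao et al., 2021).
import torch
import torch.nn as nn


class PointTransformerLayer(nn.Module):
    """Vector self-attention over the k nearest neighbours in (x, y, t) space."""

    def __init__(self, dim: int, pos_dim: int = 3, k: int = 8):
        super().__init__()
        self.k = k
        self.phi = nn.Linear(dim, dim)    # query projection
        self.psi = nn.Linear(dim, dim)    # key projection
        self.alpha = nn.Linear(dim, dim)  # value projection
        # positional encoding (theta) and attention MLP (gamma)
        self.theta = nn.Sequential(nn.Linear(pos_dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.gamma = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, feats: torch.Tensor, pos: torch.Tensor) -> torch.Tensor:
        # feats: (N, dim) node visual features; pos: (N, pos_dim) spatio-temporal positions
        knn = torch.cdist(pos, pos).topk(min(self.k, feats.size(0)), largest=False).indices
        q = self.phi(feats)                       # (N, dim)
        k_feat = self.psi(feats)[knn]             # (N, k, dim) neighbour keys
        v = self.alpha(feats)[knn]                # (N, k, dim) neighbour values
        delta = self.theta(pos.unsqueeze(1) - pos[knn])   # relative position encoding
        attn = torch.softmax(self.gamma(q.unsqueeze(1) - k_feat + delta), dim=1)
        return (attn * (v + delta)).sum(dim=1)    # (N, dim) aggregated features


class EarlyProcessClassifier(nn.Module):
    """Hypothetical head: stacked layers over the graph's nodes, pooled into one
    surgical-process prediction from the early part of the video."""

    def __init__(self, dim: int = 256, num_classes: int = 10, depth: int = 2):
        super().__init__()
        self.layers = nn.ModuleList([PointTransformerLayer(dim) for _ in range(depth)])
        self.head = nn.Linear(dim, num_classes)

    def forward(self, feats: torch.Tensor, pos: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            feats = feats + layer(feats, pos)     # residual connection
        return self.head(feats.mean(dim=0))       # global average pooling over nodes


# Usage with N = 12 participant detections sampled from the early part of a video.
feats = torch.randn(12, 256)                      # per-node visual features
pos = torch.rand(12, 3)                           # normalized (x, y, t) per node
logits = EarlyProcessClassifier()(feats, pos)     # scores over surgical processes

Because each node carries both appearance and spatio-temporal position, nearest neighbourhoods in (x, y, t) play the role of graph edges here, which matches the abstract's point-cloud view of the group-activity graph.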



Paper Citation


in Harvard Style

Nishikawa K. and Ohya J. (2025). A Group Activity Based Method for Early Recognition of Surgical Processes Using the Camera Observing Surgeries in an Operating Room and Spatio-Temporal Graph Based Deep Learning Model. In Proceedings of the 14th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM; ISBN 978-989-758-730-6, SciTePress, pages 712-724. DOI: 10.5220/0013252700003905


in BibTeX Style

@conference{icpram25,
author={Keishi Nishikawa and Jun Ohya},
title={A Group Activity Based Method for Early Recognition of Surgical Processes Using the Camera Observing Surgeries in an Operating Room and Spatio-Temporal Graph Based Deep Learning Model},
booktitle={Proceedings of the 14th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2025},
pages={712-724},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013252700003905},
isbn={978-989-758-730-6},
}


in EndNote Style

TY - CONF

JO - Proceedings of the 14th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - A Group Activity Based Method for Early Recognition of Surgical Processes Using the Camera Observing Surgeries in an Operating Room and Spatio-Temporal Graph Based Deep Learning Model
SN - 978-989-758-730-6
AU - Nishikawa K.
AU - Ohya J.
PY - 2025
SP - 712
EP - 724
DO - 10.5220/0013252700003905
PB - SciTePress