
Buchmann, V., Violich, S., Billinghurst, M., and Cockburn, A. (2004). FingARtips: Gesture based direct manipulation in augmented reality. In Conference on Computer Graphics and Interactive Techniques in Australasia and Southeast Asia.
Cao, Z., Hidalgo Martinez, G., Simon, T., Wei, S., and Sheikh, Y. A. (2019). OpenPose: Realtime multi-person 2D pose estimation using part affinity fields. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Dewez, D., Hoyet, L., Lécuyer, A., and Argelaguet, F. (2023). Do you need another hand? Investigating dual body representations during anisomorphic 3D manipulation. IEEE Transactions on Visualization and Computer Graphics.
Fernández, U. J., Elizondo, S., Iriarte, N., Morales, R., Ortiz, A., Marichal, S., Ardaiz, O., and Marzo, A. (2022). A multi-object grasp technique for placement of objects in virtual reality. Applied Sciences, 12(9).
Geng, Z., Sun, K., Xiao, B., Zhang, Z., and Wang, J. (2021). Bottom-up human pose estimation via disentangled keypoint regression. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pages 14671–14681.
Halliday, D., Resnick, R., and Walker, J. (2013). Fundamentals of physics. John Wiley & Sons.
Han, D., Lee, R., Kim, K., and Kang, H. (2023). VR-HandNet: A visually and physically plausible hand manipulation system in virtual reality. IEEE Transactions on Visualization and Computer Graphics, pages 1–12.
Han, S., Liu, B., Cabezas, R., Twigg, C. D., Zhang, P., Petkau, J., Yu, T.-H., Tai, C.-J., Akbay, M., Wang, Z., Nitzan, A., Dong, G., Ye, Y., Tao, L., Wan, C., and Wang, R. (2020). MegaTrack: Monochrome egocentric articulated hand-tracking for virtual reality. ACM Trans. Graph., 39(4).
Han, S., Wu, P.-C., Zhang, Y., Liu, B., Zhang, L., Wang, Z., Si, W., Zhang, P., Cai, Y., Hodan, T., Cabezas, R., Tran, L., Akbay, M., Yu, T.-H., Keskin, C., and Wang, R. (2022). UmeTrack: Unified multi-view end-to-end hand tracking for VR. In SIGGRAPH Asia 2022 Conference Papers, SA '22, New York, NY, USA. Association for Computing Machinery.
Höll, M., Oberweger, M., Arth, C., and Lepetit, V. (2018). Efficient physics-based implementation for realistic hand-object interaction in virtual reality. In Proc. of Conference on Virtual Reality and 3D User Interfaces.
HTC (2023). Vive Hand Tracking SDK. Accessed: 2023-10-03.
Hung, C.-W., Chang, R.-C., Chen, H.-S., Liang, C. H., Chan, L., and Chen, B.-Y. (2022). Puppeteer: Exploring intuitive hand gestures and upper-body postures for manipulating human avatar actions. In Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, VRST '22, New York, NY, USA. Association for Computing Machinery.
Kaminski, E., Kepplier, S., and Israel, J. H. (2022). Physics-based hand-object-interaction for immersive environments. Mensch und Computer.
Kim, J.-S. and Park, J. M. (2015). Physics-based hand interaction with virtual objects. 2015 IEEE International Conference on Robotics and Automation (ICRA), pages 3814–3819.
Kumar, V. and Todorov, E. (2015). MuJoCo HAPTIX: A virtual reality system for hand manipulation. 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), pages 657–663.
Liu, H., Zhang, Z., Xie, X., Zhu, Y., Liu, Y., Wan, Y., and Zhu, S.-C. (2019). High-fidelity grasping in virtual reality using a glove-based system. IEEE International Conference on Robotics and Automation (ICRA).
Liu, S., Wu, W.-X., Wu, J., and Lin, Y. (2022). Spatial-temporal parallel transformer for arm-hand dynamic estimation. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pages 24091–24100.
Mehta, D., Sotnychenko, O., Mueller, F., Xu, W., Elgharib, M. A., Fua, P., Seidel, H.-P., Rhodin, H., Pons-Moll, G., and Theobalt, C. (2020). XNect: Real-time multi-person 3D motion capture with a single RGB camera. In International Conference on Computer Graphics and Interactive Techniques.
Meta (2022). Interaction SDK overview. Accessed: 2023-10-03.
NaturalPoint (2023). OptiTrack. Accessed: 2023-10-03.
Oprea, S., Martinez-Gonzalez, P., Garcia-Garcia, A., Castro-Vargas, J. A., Orts-Escolano, S., and Garcia-Rodriguez, J. (2019). A visually realistic grasping system for object manipulation and interaction in virtual reality environments. Comput. Graph., 83(C):77–86.
Prachyabrued, M. and Borst, C. W. (2014). Visual feedback for virtual grasping. 2014 IEEE Symposium on 3D User Interfaces (3DUI), pages 19–26.
Schäfer, A., Reis, G., and Stricker, D. (2022). The gesture authoring space: Authoring customised hand gestures for grasping virtual objects in immersive virtual environments. Mensch und Computer, pages 1–12.
Ultraleap (2023). ultraleap.com. Accessed: 2023-07-23.
Vosinakis, S. and Koutsabasis, P. (2018). Evaluation of visual feedback techniques for virtual grasping with bare hands using Leap Motion and Oculus Rift. Virtual Reality, 22:47–62.
Winkler, A., Won, J., and Ye, Y. (2022). QuestSim: Human motion tracking from sparse sensors with simulated avatars. In SIGGRAPH Asia 2022 Conference Papers, SA '22, New York, NY, USA. Association for Computing Machinery.
Yang, D., Kim, D., and Lee, S.-H. (2021). LoBSTr: Real-time lower-body pose prediction from sparse upper-body tracking signals. Computer Graphics Forum, 40.
Yi, X., Zhou, Y., Habermann, M., Shimada, S., Golyanik, V., Theobalt, C., and Xu, F. (2022). Physical Inertial Poser (PIP): Physics-aware real-time human motion tracking from sparse inertial sensors. In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
Yin, Z. and Yin, K. (2020). Linear time stable PD controllers for physics-based character animation. Computer Graphics Forum, 39(8):191–200.
Zhang, F., Bazarevsky, V., Vakunov, A., Tkachenka, A., Sung, G., Chang, C.-L., and Grundmann, M. (2020). MediaPipe Hands: On-device real-time hand tracking. ArXiv, abs/2006.10214.