Paper

Authors: Yemineni Ashok; Mukesh Rohil; Kshitij Tandon and Harshil Sethi

Affiliation: Birla Institute of Technology and Science, Pilani (BITS Pilani), Pilani, Rajasthan, India

Keyword(s): Visual Odometry, Wearable Computing, Augmented Reality, Mixed Reality, Pose Estimation, Simultaneous Localization and Mapping.

Abstract: Visual Odometry/Simultaneous Localization and Mapping (VO/SLAM) and egocentric hand gesture recognition are two key technologies for wearable computing devices such as AR (Augmented Reality)/MR (Mixed Reality) glasses. However, the AR/MR community lacks a suitable dataset for developing both hand gesture recognition and RGB-D SLAM methods. In this work, we use a ZED Mini camera to develop challenging benchmarks for RGB-D VO/SLAM tasks and dynamic hand gesture recognition. For our dataset, VOEDHgesture, we collected 264 sequences using a ZED Mini camera, along with precisely measured and time-synchronized ground-truth camera positions, and manually annotated bounding boxes for the hand region of interest. The sequences comprise both RGB and depth images, captured at HD resolution (1920 × 1080) and recorded at a frame rate of 30 Hz. To resemble an Augmented Reality environment, the sequences are captured with a head-mounted ZED Mini camera under unrestricted 6-DOF (degrees of freedom) movement across a variety of scenes and camera motions, i.e., indoor, outdoor, slow motions, quick motions, long trajectories, loop closures, etc. This dataset can help researchers develop and promote reproducible research in egocentric hand tracking, visual odometry/SLAM, and computer vision algorithms for AR scene reconstruction and scene understanding.
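To make the dataset description above concrete, the following Python sketch shows how one sequence might be read: RGB and depth frame paths, time-synchronized ground-truth camera poses, and hand bounding-box annotations. The directory layout, file names (rgb/, depth/, groundtruth.txt, hand_bboxes.txt), and record formats are assumptions made for illustration only; they are not specified by the paper, and the released dataset may organize its sequences differently.

# Minimal sketch of reading one VOEDHgesture-style sequence.
# NOTE: directory layout, file names, and pose/bbox record formats below are
# hypothetical; consult the dataset release for the actual structure.
from pathlib import Path

import numpy as np


def load_sequence(seq_dir: str):
    """Load RGB/depth frame paths, ground-truth poses, and hand bounding boxes."""
    seq = Path(seq_dir)

    # Assumed layout: rgb/ and depth/ hold 1920x1080 images recorded at 30 Hz,
    # named by timestamp so the two streams can be associated.
    rgb_frames = sorted((seq / "rgb").glob("*.png"))
    depth_frames = sorted((seq / "depth").glob("*.png"))

    # Assumed groundtruth.txt: one pose per line as
    # "timestamp tx ty tz qx qy qz qw" (translation + orientation quaternion).
    poses = np.loadtxt(seq / "groundtruth.txt", comments="#")

    # Assumed hand_bboxes.txt: one annotation per line as
    # "timestamp x_min y_min x_max y_max" for the hand region of interest.
    bboxes = np.loadtxt(seq / "hand_bboxes.txt", comments="#")

    return rgb_frames, depth_frames, poses, bboxes


def nearest_pose(poses: np.ndarray, timestamp: float) -> np.ndarray:
    """Associate a frame timestamp with the closest ground-truth pose."""
    idx = int(np.argmin(np.abs(poses[:, 0] - timestamp)))
    return poses[idx]

A full loader would additionally decode the 16-bit depth images (e.g., with OpenCV) and apply the camera intrinsics; those details are omitted from this sketch.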

CC BY-NC-ND 4.0

Paper citation in several formats:
Ashok, Y.; Rohil, M.; Tandon, K. and Sethi, H. (2024). VOEDHgesture: A Multi-Purpose Visual Odometry/ Simultaneous Localization and Mapping and Egocentric Dynamic Hand Gesture Data-Set for Virtual Object Manipulations in Wearable Mixed Reality. In Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART; ISBN 978-989-758-680-4; ISSN 2184-433X, SciTePress, pages 1336-1344. DOI: 10.5220/0012473900003636

@conference{icaart24,
  author={Yemineni Ashok and Mukesh Rohil and Kshitij Tandon and Harshil Sethi},
  title={VOEDHgesture: A Multi-Purpose Visual Odometry/ Simultaneous Localization and Mapping and Egocentric Dynamic Hand Gesture Data-Set for Virtual Object Manipulations in Wearable Mixed Reality},
  booktitle={Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART},
  year={2024},
  pages={1336-1344},
  publisher={SciTePress},
  organization={INSTICC},
  doi={10.5220/0012473900003636},
  isbn={978-989-758-680-4},
  issn={2184-433X},
}

TY  - CONF
JO  - Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART
TI  - VOEDHgesture: A Multi-Purpose Visual Odometry/ Simultaneous Localization and Mapping and Egocentric Dynamic Hand Gesture Data-Set for Virtual Object Manipulations in Wearable Mixed Reality
SN  - 978-989-758-680-4
IS  - 2184-433X
AU  - Ashok, Y.
AU  - Rohil, M.
AU  - Tandon, K.
AU  - Sethi, H.
PY  - 2024
SP  - 1336
EP  - 1344
DO  - 10.5220/0012473900003636
PB  - SciTePress
ER  -