Microsoft HoloLens² and Holo-Stylus³ are just two
of them. However, all of them are built on specialized
hardware and proprietary software, and they are
expensive. On the other hand, smartphones are
continuously evolving, adding more computing power,
more sensors, and high-quality displays. Multiple
cameras and depth sensors are among their recent
additions. We therefore expect that it will become
possible to implement all the functionalities of an AR
system on a smartphone alone. In that case, computing
power will be at a premium, and new fast and efficient
algorithms will be needed. One way to achieve this is
to make them task-specific. cMinMax is such an
example: it finds the corners of a marker (a convex
quadrangle) almost ten times faster than the commonly
used Harris corner detection algorithm. The fusion
of data obtained from different mobile sensors (multiple
RGB cameras, depth camera, ultrasound sensor,
three-axis gyroscope, accelerometer, proximity sensor,
etc.) to locate 3D objects in 3D space in real time
and register them to the virtual world is another
challenging task. A simple example is presented in
subsubsection 4.4.3, where we combine data from an RGB
and a depth camera in order to find the 3D coordinates
of a small ball (approximated as a point) in space.
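As a hedged illustration of the kind of RGB-D fusion described above: once the ball's centre has been detected in the image and a depth value read at (roughly) the same pixel, its 3D position in the camera frame follows from the pinhole camera model. The intrinsics below (`fx`, `fy`, `cx`, `cy`) are made-up illustrative values, not those of any particular device.

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with a depth value (in metres) to
    3D camera-frame coordinates using the pinhole camera model.
    fx, fy are focal lengths in pixels; (cx, cy) is the principal point."""
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Example: ball centre detected at pixel (400, 300), depth reads 1.5 m.
fx = fy = 525.0          # assumed focal lengths (pixels)
cx, cy = 319.5, 239.5    # assumed principal point
p = backproject(400, 300, 1.5, fx, fy, cx, cy)
```

In practice the RGB and depth cameras have different intrinsics and a relative pose, so the depth pixel must first be mapped into the RGB frame (or vice versa) before this back-projection is applied.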
7 CONCLUSIONS
This paper has presented an inexpensive, single-user
realization of a system with a 3D tangible user
interface built with off-the-shelf components. The
system is easy to implement, runs in real time, and is
suitable as an experimental AR testbed where we can
try new concepts and methods. We optimized its
performance either by moving computational complexity
out of the main loop of operation or by using fast,
task-specific procedures. cMinMax, a new algorithm for
finding the corners of a marker's mask, is such an
example, where we have sacrificed generality in order
to gain speed.
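That trade-off can be illustrated with a minimal sketch of extreme-point corner finding on a convex quadrangle mask. This is a simplified stand-in for cMinMax, not the published algorithm: it simply takes the foreground pixels that are extremal along each image axis, which is well defined only when the quadrangle is not axis-aligned (for an axis-aligned rectangle, whole edges are extremal).

```python
import numpy as np

def quad_corners_minmax(mask):
    """Approximate the four corners of a convex quadrangle in a binary
    mask by its extreme foreground pixels along the x and y axes.
    A simplified illustration of the min/max idea, not cMinMax itself."""
    ys, xs = np.nonzero(mask)          # coordinates of foreground pixels
    pts = np.stack([xs, ys], axis=1)   # (x, y) pairs
    return np.array([pts[xs.argmin()],    # leftmost
                     pts[ys.argmin()],    # topmost
                     pts[xs.argmax()],    # rightmost
                     pts[ys.argmax()]])   # bottommost

# Example: a diamond (a square rotated 45 degrees) inside a 7x7 grid.
mask = np.zeros((7, 7), dtype=np.uint8)
for y in range(7):
    for x in range(7):
        if abs(x - 3) + abs(y - 3) <= 3:
            mask[y, x] = 1
corners = quad_corners_minmax(mask)   # [[0 3], [3 0], [6 3], [3 6]]
```

The speed advantage over a generic detector comes from replacing a per-pixel corner response over the whole image with four simple extremal probes on the mask.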
ACKNOWLEDGEMENTS
We would like to thank the members of the Visualization
and Virtual Reality Group of the Department of
Electrical and Computer Engineering of the University
of Patras, as well as the members of the Multimedia
Research Lab of the Xanthi Division of the "Athena"
Research and Innovation Center, for their comments
and advice during the preparation of this work.

² https://www.microsoft.com/en-us/hololens
³ https://www.holo-stylus.com
REFERENCES
Avouris, N., Katsanos, C., Tselios, N., and Moustakas, K.
(2015). Introduction to human-computer interaction.
The Kallipos Repository.
Azuma, R. T. (1997). A survey of augmented reality. Pres-
ence: Teleoperators & Virtual Environments, 6(4):355–
385.
Besançon, L., Issartel, P., Ammi, M., and Isenberg, T. (2017).
Mouse, tactile, and tangible input for 3d manipulation.
In Proceedings of the 2017 CHI Conference on Hu-
man Factors in Computing Systems, pages 4727–4740.
ACM.
Billinghurst, M., Clark, A., Lee, G., et al. (2015). A survey
of augmented reality. Foundations and Trends® in
Human–Computer Interaction, 8(2–3):73–272.
Billinghurst, M., Kato, H., and Poupyrev, I. (2008). Tangible
augmented reality. ACM SIGGRAPH ASIA, 7.
Harris, C. G., Stephens, M., et al. (1988). A combined corner
and edge detector. In Alvey vision conference, volume
15.50, pages 10–5244. Citeseer.
Hernandez-Lopez, J.-J., Quintanilla-Olvera, A.-L., López-
Ramírez, J.-L., Rangel-Butanda, F.-J., Ibarra-Manzano,
M.-A., and Almanza-Ojeda, D.-L. (2012). Detecting
objects using color and depth segmentation with kinect
sensor. Procedia Technology, 3:196–204.
Ishii, H. et al. (2008). The tangible user interface and its
evolution. Communications of the ACM, 51(6):32.
Ishii, H. and Ullmer, B. (1997). Tangible bits: towards seam-
less interfaces between people, bits and atoms. In Pro-
ceedings of the ACM SIGCHI Conference on Human
factors in computing systems, pages 234–241. ACM.
Martens, J.-B., Qi, W., Aliakseyeu, D., Kok, A. J., and van
Liere, R. (2004). Experiencing 3d interactions in vir-
tual reality and augmented reality. In Proceedings of
the 2nd European Union symposium on Ambient intel-
ligence, pages 25–28. ACM.
OpenCV (2019). Harris corner detection. https://docs.
opencv.org/master/d4/d7d/tutorial_harris_detector.html.
Panagiotopoulos, T., Arvanitis, G., Moustakas, K., and Fako-
takis, N. (2017). Generation and authoring of aug-
mented reality terrains through real-time analysis of
map images. In Scandinavian Conference on Image
Analysis, pages 480–491. Springer.
Reitmayr, G., Chiu, C., Kusternig, A., Kusternig, M., and
Witzmann, H. (2005). iOrb: unifying command and 3d
input for mobile augmented reality. In Proc. IEEE VR
Workshop on New Directions in 3D User Interfaces,
pages 7–10.
Shaer, O., Hornecker, E., et al. (2010). Tangible user
interfaces: past, present, and future directions.
Foundations and Trends® in Human–Computer Interaction,
3(1–2):4–137.
Teng, C.-H. and Peng, S.-S. (2017). Augmented-reality-
based 3d modeling system using tangible interface.
Sensors and Materials, 29(11):1545–1554.
3D Augmented Reality Tangible User Interface using Commodity Hardware