annotation to implement. Following the recent work
of Keil et al. (2018), our class diagram differentiates
between “Label”, “Icon”, “Highlighted”, “X-ray”,
“Aids / Indicator / Guide”, “Explosion diagram” and
“Transmedia material”. All these objects inherit from
the “Content” class, which requires developers to
define a set of key-value pairs. In our data model,
these pairs are represented by a set of “Property” objects
within the “Content” class. In addition to its id, name
and type attributes, each “Property” object holds a set
of “Value” objects, each with its own id and value
attributes. In this way, any type of annotation can be
implemented as a list of properties.
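The property-based model described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class and attribute names (“Content”, “Property”, “Value”, id, name, type, value) come from the class diagram, while the concrete Python types and the example “Label” annotation are assumptions.

```python
# Sketch of the "Content" / "Property" / "Value" data model: every
# annotation type is expressed as a list of typed key-value properties.
# Concrete field types and the example data are assumptions.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Value:
    id: int
    value: str  # serialized value; a real model may use typed payloads


@dataclass
class Property:
    id: int
    name: str
    type: str                                   # e.g. "string", "number"
    values: List[Value] = field(default_factory=list)


@dataclass
class Content:
    id: int
    kind: str                                   # "Label", "Icon", "X-ray", ...
    properties: List[Property] = field(default_factory=list)


# A hypothetical "Label" annotation built purely from properties:
label = Content(
    id=1,
    kind="Label",
    properties=[
        Property(id=1, name="text", type="string",
                 values=[Value(id=1, value="Valve A")]),
        Property(id=2, name="fontSize", type="number",
                 values=[Value(id=2, value="14")]),
    ],
)
print(label.properties[0].values[0].value)  # -> Valve A
```

Because the annotation type is reduced to a kind plus a property list, adding a new annotation type requires no schema change, only a new combination of properties.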
6 CONCLUSION AND FUTURE WORK
This work presents a study and characterization of
AR annotations from a software engineering point of
view. To this end, we analyzed works that theorize
about virtual elements in AR in general and about AR
annotations in particular. From this analysis of the
existing literature, we derived a taxonomy of AR
annotations that classifies the characteristics of these
virtual elements along four axes: content, location,
temporality and interaction. The taxonomy builds on
a generic definition of AR annotation that encompasses
any virtual element meeting the requirement proposed
by Wither et al. (2009): having both a spatially dependent
component and a spatially independent component.
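This generic definition, together with the four taxonomy axes, can be summarized in a short sketch. All names here are illustrative assumptions, not the paper's detailed class diagram:

```python
# Sketch of the generic annotation definition: a spatially dependent
# component (the anchor) paired with a spatially independent one (the
# content), characterized along the four taxonomy axes. All names are
# illustrative assumptions.

from dataclasses import dataclass
from enum import Enum


class Temporality(Enum):
    PERMANENT = "permanent"
    EPHEMERAL = "ephemeral"


@dataclass
class Anchor:            # spatially dependent component (location axis)
    x: float
    y: float
    z: float


@dataclass
class Annotation:
    anchor: Anchor       # location axis
    content: str         # content axis (simplified to text here)
    temporality: Temporality  # temporality axis
    interactive: bool    # interaction axis


note = Annotation(Anchor(0.5, 1.2, -3.0), "Check torque",
                  Temporality.PERMANENT, True)
print(note.temporality.value)  # -> permanent
```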
This taxonomy has allowed us to propose a data
model capable of supporting any type of AR
annotation, regardless of the hardware used. Following
this first model proposal, the next step will be to
implement a system based on a more detailed version
of the proposed class diagram and to carry out the
tests needed to refine it. This would allow us to
offer a definitive solution to the incompatibility
problem of AR annotation systems. Given the growing
number of applications that make use of AR annotations,
we believe that a common framework is of great
importance: it eases the work of developers and offers
users a more consistent experience when interacting
with different types of AR annotations.
ACKNOWLEDGEMENTS
I.G-P acknowledges the Spanish Ministry of Science,
Innovation and Universities (program: “University
Teacher Formation”) for supporting this study.
REFERENCES
Bruno, F., Barbieri, L., Marino, E., Muzzupappa, M.,
D’Oriano, L., & Colacino, B. (2019). An augmented
reality tool to detect and annotate design variations in
an Industry 4.0 approach. The International Journal of
Advanced Manufacturing Technology.
Caudell, T. P., & Mizell, D. W. (1992). Augmented reality:
An application of heads-up display technology to
manual manufacturing processes. Proceedings of the
Twenty-Fifth Hawaii International Conference on
System Sciences, 659–669, Vol. 2.
Chang, Y. S., Nuernberger, B., Luan, B., & Höllerer, T.
(2017). Evaluating gesture-based augmented reality
annotation. 2017 IEEE Symposium on 3D User
Interfaces (3DUI), 182–185.
Feiner, S., MacIntyre, B., & Seligmann, D. D. (1992).
Annotating the real world with knowledge-based
graphics on a see-through head-mounted display.
Furness, T. A. (1986). The Super Cockpit and its Human
Factors Challenges. Proceedings of the Human Factors
Society Annual Meeting, 30(1), 48–52.
García-Pereira, I., Gimeno, J., Pérez, M., Portalés, C., &
Casas, S. (2018). MIME: A Mixed-Space Collaborative
System with Three Immersion Levels and Multiple
Users. 2018 IEEE International Symposium on Mixed
and Augmented Reality Adjunct (ISMAR), 179–183.
Grønbæk, K., Hem, J. A., Madsen, O. L., & Sloth, L. (1994).
Cooperative hypermedia systems: A Dexter-based
architecture. Communications of the ACM, 37(2), 64–74.
Hansen, F. A. (2006). Ubiquitous Annotation Systems:
Technologies and Challenges. Proceedings of the
Seventeenth Conference on Hypertext and Hypermedia,
121–132.
Kahan, J., & Koivunen, M.-R. (2001). Annotea: An Open
RDF Infrastructure for Shared Web Annotations.
Proceedings of the 10th International Conference on
World Wide Web, 623–632.
Keil, J., Schmitt, F., Engelke, T., Graf, H., & Olbrich, M.
(2018). Augmented Reality Views: Discussing the
Utility of Visual Elements by Mediation Means in
Industrial AR from a Design Perspective. In J. Y. C.
Chen & G. Fragomeni (Eds.), Virtual, Augmented and
Mixed Reality: Applications in Health, Cultural
Heritage, and Industry (pp. 298–312).
Müller, T. (2019). Challenges in representing information
with augmented reality to support manual procedural
tasks. ElectrEng 2019, Vol. 3, pp. 71–97.
Rekimoto, J., & Nagao, K. (1995). The World Through the
Computer: Computer Augmented Interaction with Real
World Environments. Proceedings of the 8th Annual
ACM Symposium on User Interface and Software
Technology, 29–36.
Tönnis, M., Plecher, D. A., & Klinker, G. (2013).
Representing information – Classifying the Augmented
Reality presentation space. Computers & Graphics,
37(8), 997–1011.
Wither, J., DiVerdi, S., & Höllerer, T. (2009). Annotation
in outdoor augmented reality. Computers & Graphics,
33(6), 679–689.