
ACKNOWLEDGEMENTS
This material is based upon work supported in part
by the National Science Foundation under Grants No.
2219842 and 2318657. We thank Shuqi Liao for her
help with the experiments.
REFERENCES
Andersen, D., Popescu, V., Cabrera, M. E., Shanghavi,
A., Gomez, G., Marley, S., Mullis, B., and Wachs,
J. (2016a). Virtual annotations of the surgical field
through an augmented reality transparent display. The
Visual Computer, 32(11):1481–1498.
Andersen, D., Popescu, V., Lin, C., Cabrera, M. E., Shang-
havi, A., and Wachs, J. (2016b). A hand-held, self-
contained simulated transparent display. In 2016 IEEE
International Symposium on Mixed and Augmented
Reality (ISMAR-Adjunct), pages 96–101.
Babic, T., Perteneder, F., Reiterer, H., and Haller, M.
(2020). Simo: Interactions with distant displays by
smartphones with simultaneous face and world track-
ing. In Extended Abstracts of the 2020 CHI Confer-
ence on Human Factors in Computing Systems, pages
1–12.
Baričević, D., Höllerer, T., Sen, P., and Turk, M. (2017).
User-perspective AR magic lens from gradient-based
IBR and semi-dense stereo. IEEE Transactions on Visu-
alization and Computer Graphics, 23(7):1838–1851.
Bauer, M., Kortuem, G., and Segall, Z. (1999). “Where
are you pointing at?” A study of remote collaboration
in a wearable videoconference system. In Digest of
Papers. Third International Symposium on Wearable
Computers, pages 151–158. IEEE.
Billinghurst, M., Kato, H., Kiyokawa, K., Belcher, D., and
Poupyrev, I. (2002). Experiments with face-to-face
collaborative AR interfaces. Virtual Reality, 6(3):107–
121.
Boring, S., Baur, D., Butz, A., Gustafson, S., and Baudisch,
P. (2010). Touch projector: Mobile interaction through
video. In Proceedings of the SIGCHI Conference on
Human Factors in Computing Systems, pages 2287–
2296.
Bork, F., Schnelzer, C., Eck, U., and Navab, N. (2018).
Towards efficient visual guidance in limited field-of-
view head-mounted displays. IEEE Transactions on
Visualization and Computer Graphics, 24(11):2983–
2992.
Borsoi, R. A. and Costa, G. H. (2018). On the perfor-
mance and implementation of parallax free video see-
through displays. IEEE Transactions on Visualization
and Computer Graphics, 24(6):2011–2022.
Bradski, G. (2000). The OpenCV Library. Dr. Dobb’s Jour-
nal of Software Tools.
Čopič Pucihar, K., Coulton, P., and Alexander, J. (2013).
Evaluating dual-view perceptual issues in handheld
augmented reality: Device vs. user perspective ren-
dering. In Proceedings of the 15th ACM International
Conference on Multimodal Interaction, pages
381–388.
Davison, A. J., Reid, I. D., Molton, N. D., and Stasse, O.
(2007). MonoSLAM: Real-time single camera SLAM.
IEEE Transactions on Pattern Analysis and Machine
Intelligence, 29(6):1052–1067.
Gauglitz, S., Lee, C., Turk, M., and Höllerer, T. (2012).
Integrating the physical environment into mobile re-
mote collaboration. In Proceedings of the 14th Inter-
national Conference on Human-Computer Interaction
with Mobile Devices and Services, pages 241–250.
Gauglitz, S., Nuernberger, B., Turk, M., and Höllerer, T.
(2014). World-stabilized annotations and virtual scene
navigation for remote collaboration. In Proceedings
of the 27th Annual ACM Symposium on User Interface
Software and Technology, pages 449–459.
Hill, A., Schiefer, J., Wilson, J., Davidson, B., Gandy, M.,
and MacIntyre, B. (2011). Virtual transparency: In-
troducing parallax view into video see-through AR. In
2011 10th IEEE International Symposium on Mixed
and Augmented Reality, pages 239–240.
Irlitti, A., Smith, R. T., Von Itzstein, S., Billinghurst, M.,
and Thomas, B. H. (2016). Challenges for asyn-
chronous collaboration in augmented reality. In 2016
IEEE International Symposium on Mixed and Aug-
mented Reality (ISMAR-Adjunct), pages 31–35. IEEE.
Kaufmann, H. (2003). Collaborative augmented reality in
education. Institute of Software Technology and In-
teractive Systems, Vienna University of Technology,
pages 2–4.
Lin, T.-H., Liu, C.-H., Tsai, M.-H., and Kang, S.-C. (2015).
Using augmented reality in a multiscreen environment
for construction discussion. Journal of Computing in
Civil Engineering, 29(6):04014088.
Lowe, D. G. (2004). Distinctive image features from scale-
invariant keypoints. International Journal of Computer
Vision, 60(2):91–110.
Mohr, P., Tatzgern, M., Grubert, J., Schmalstieg, D., and
Kalkofen, D. (2017). Adaptive user perspective ren-
dering for handheld augmented reality. In 2017 IEEE
Symposium on 3D User Interfaces (3DUI), pages
176–181. IEEE.
Samini, A. and Palmerius, K. (2014). A perspective geom-
etry approach to user-perspective rendering in hand-
held video see-through augmented reality.
Sörös, G., Seichter, H., Rautek, P., and Gröller, E. (2011).
Augmented visualization with natural feature track-
ing. In Proceedings of the 10th International Con-
ference on Mobile and Ubiquitous Multimedia, pages
4–12.
Tomioka, M., Ikeda, S., and Sato, K. (2013). Approximated
user-perspective rendering in tablet-based augmented
reality. In 2013 IEEE International Symposium on
Mixed and Augmented Reality (ISMAR), pages 21–28.
Zhang, E., Saito, H., and de Sorbier, F. (2013). From
smartphone to virtual window. In 2013 IEEE Inter-
national Conference on Multimedia and Expo Work-
shops (ICMEW), pages 1–6.
HUCAPP 2024 - 8th International Conference on Human Computer Interaction Theory and Applications