map with baked lights. This would remove the double lighting and leave only the albedo. However, as a perfect match of intensity and colour between the physical and virtual lights is difficult to obtain, this option was skipped due to time and resource limitations.
Despite our best efforts, capturing the physical setup, converting it to a virtual one, and maintaining the correct illumination throughout the pipeline proved difficult. In most cases, the majority of the pipeline has to be redone if a single step fails. It is therefore crucial to have a clearly defined setup and a well-specified capture procedure. Ideally, no changes are applied to the setup and hardware while capturing the environment and the objects.
As long as marker-based tracking is not stable enough to go unnoticed, the freedom of movement has to be restricted. Alternatively, the tracking might be made more stable at the expense of the frame rate, or another tracking method can be used.
7 CONCLUSION
We have shown that it is possible to render an augmented object in real time (apart from pre-rendered ambient shadows) that cannot be distinguished from a real object, even when the two are compared side by side. This
has been achieved by creating a setup to evaluate the
visual realism of augmented objects, which took into
consideration the environment and the artefacts of the
video feed. Results showed that highlights are important for the perception of realism, as are object silhouettes and shadows. Furthermore, it was shown that real-time shadows can be of sufficient quality to enhance the perception of reality. Additionally, preliminary tests showed that simulating camera noise is an important factor in integrating a virtual object.
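The camera-noise simulation mentioned above can be illustrated with a minimal sketch: adding zero-mean Gaussian noise to the rendered pixels of the virtual object so that its grain resembles that of the video feed. The function name, the `sigma` value, and the choice of simple per-pixel Gaussian noise are illustrative assumptions; real sensor noise is considerably more complex.

```python
import numpy as np

def add_camera_noise(rendered, sigma=4.0, seed=None):
    """Add zero-mean Gaussian noise to a rendered RGB image (uint8),
    giving the virtual object grain similar to the camera feed.
    `sigma` is the noise standard deviation in 8-bit intensity units."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, sigma, rendered.shape)
    noisy = rendered.astype(np.float64) + noise
    return np.clip(noisy, 0, 255).astype(np.uint8)

# Example: a flat grey patch standing in for the rendered object.
patch = np.full((8, 8, 3), 128, dtype=np.uint8)
noisy = add_camera_noise(patch, sigma=4.0, seed=0)
```

In practice the noise parameters would be estimated from the live video feed rather than fixed, so that the synthetic grain tracks the camera's actual behaviour.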
8 FUTURE WORK
It would be of great interest to establish a common way to capture the environment and to maintain consistent units throughout the pipeline. With such guidelines, a photo-realistic scene could be set up quickly and used in an application.
Further research is suggested to evaluate other parameters, for instance colour bleeding and a larger variety of materials and shapes. Movement and animation, as well as context, could also be of interest. With moving objects, the influence of motion blur could be evaluated, and the attention paid to an object would presumably differ. When evaluating context, different sceneries and their influence on the objects could be examined.
Perceptual Evaluation of Photo-realism in Real-time 3D Augmented Reality