Reality and using SAR techniques, the proposed
system allowed multiple users to view AR images at
the same time. The proposed system used an AIP to
deliver mid-air images to users; coupled with a
Kinect sensor for hand interaction or AR glove
technology, it allowed users to physically influence
the AR image seen in mid-air. The system contained a
camera that performed head tracking, allowing view-
corrective rendering of virtual images.
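View-corrective rendering of this kind is commonly implemented as an off-axis (generalized perspective) projection driven by the tracked head position, as formulated by Kooima (2008). The sketch below illustrates that standard technique only; it is not the authors' implementation, and the screen corner coordinates and units are assumptions for illustration.

```python
# Illustrative sketch of off-axis projection from a tracked head position
# (generalized perspective projection). Not the paper's actual code.

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])
def norm(a):
    n = dot(a, a) ** 0.5
    return tuple(x / n for x in a)

def off_axis_frustum(pe, pa, pb, pc, near=0.1):
    """Frustum extents (l, r, b, t) at the near plane for eye position `pe`
    and a screen with corners pa (lower-left), pb (lower-right),
    pc (upper-left), all in the same world units (assumed metres)."""
    vr = norm(sub(pb, pa))            # screen-space right axis
    vu = norm(sub(pc, pa))            # screen-space up axis
    vn = norm(cross(vr, vu))          # screen normal, toward the viewer
    va, vb, vc = sub(pa, pe), sub(pb, pe), sub(pc, pe)
    d = -dot(va, vn)                  # perpendicular eye-to-screen distance
    return (dot(vr, va) * near / d, dot(vr, vb) * near / d,
            dot(vu, va) * near / d, dot(vu, vc) * near / d)

# A head centred in front of a 2x2 screen yields a symmetric frustum;
# moving the head off-centre skews the frustum, producing the
# view-corrected image.
l, r, b, t = off_axis_frustum(pe=(0, 0, 1),
                              pa=(-1, -1, 0), pb=(1, -1, 0), pc=(-1, 1, 0))
```

As the tracked head moves, the frustum is recomputed each frame and passed to the renderer in place of a fixed symmetric projection.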
Although other designs were developed, only a
single concept design was presented in this paper;
the other designs, which differ in their
implementation, are not covered.
These other concept designs were intended to
produce a robust system with a large workspace. This
type of design was flawed due to its redundancy:
these systems contained too many components for
what was meant to be simple yet elegant. Furthermore,
they relied heavily on the code presented by
Hilliges et al. (2012), which would have allowed
precise hand interaction with mid-air images. The
code was requested from the authors, but no response
was received.
The concept design seen in Fig 5 was created as a
simpler version of the previous designs, one that was
more flexible in its method of projection, since it
allowed both layouts granted by the ASKA3D AIP.
The current design iteration (Fig 5) makes allowance
for a Kinect sensor and a web camera to be mounted
on the system. It was created as a desktop system
that required only a laptop, rather than a dedicated
computer, to operate. This would make it highly
accessible to everyday consumers wanting to
experience AR technology. User interaction with this
system (Fig 5) came not through direct hand
interaction but through glove interaction: users
wearing AR glove technology were able to influence
the AR image. CaptoGlove was selected as the AR
glove technology for this system. An allowance was
also made for a camera to be positioned on the
system to enable view-dependent rendering.
This research covered a single part of a whole
system: the physical hardware required to
implement a Collaborative SAR system. The concept
design meets the desired goal of the system and will
be the design used when building the final system,
although it will need to be refined after further
review before it can be considered complete.
Since the method of projection relied on whether
the ASKA3D plate performed mid-air projection as
intended, a test structure was created to test the mid-
air image projection (Fig 6). The image projection
delivered far exceeded expectations. Since it was
possible to deliver a mid-air image, an experiment
was created to evaluate the quality of experience
granted by the projection technique. The experiment
used for testing was based on the QoE evaluation
framework created by Zhang et al. (2018). The
experiment allowed users to view and interact with
the mid-air images and evaluate their experiences
through a questionnaire they had to complete. The
answered questionnaires can be found by following
the link provided:
https://github.com/Dashlen/Questionnair-results-for-SAR-System/issues/1#issue-564724513.
The data was then tabulated (Table 1) and a graph
showing the average percentage rating for each
system property was created (Fig 7). The feedback
from users showed an average rating of 80% for
Content Quality. This high percentage showed that,
based on user evaluation, the images perceived by
users were realistic and did not require intense focus
to observe.
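The aggregation behind these per-property averages can be sketched as follows. The rating scale and the response values below are illustrative assumptions, not the actual questionnaire data (which is available at the link above).

```python
# Sketch of how per-property average percentage ratings (Table 1 / Fig 7)
# could be computed. Responses below are hypothetical, not the real data.

def average_percentage(ratings, scale_max=5):
    """Average a list of Likert ratings and express it as a percentage,
    assuming a 1..scale_max scale (an assumption, not stated in the paper)."""
    return 100.0 * sum(ratings) / (len(ratings) * scale_max)

# Hypothetical responses per QoE property:
responses = {
    "Content Quality": [4, 4, 5, 3],
    "Hardware Quality": [3, 4, 3, 4],
    "Environment Understanding": [3, 3, 3, 3],
    "User Interaction": [2, 2, 1, 2],
}

averages = {prop: average_percentage(vals)
            for prop, vals in responses.items()}
for prop, pct in averages.items():
    print(f"{prop}: {pct:.1f}%")
```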
The average rating for Hardware Quality,
concerning user mobility and comfort, was 68%.
Users found the experience both physically and
visually comfortable, since no headset was required;
furthermore, no eye soreness was reported. The
moderate rating was due to the limited visual
freedom granted by the system, which may be
attributed to users exiting the viewing angle of the
system, the size of the ASKA3D plate, and how far it
is situated from the LCD screen.
The average rating for Environment Understanding
was 60%. While the system was able to deliver images
that could fit any environment, the projected images
could not interact with foreign objects; any interaction
with physical objects would cause the mid-air image
to lose its holographic effect on the users.
User Interaction was given a low average rating of
36%, since the users were unable to control the image
they were viewing. As a result, they could only rate
the interaction granted by the system as “very bad”.
Originally, software was designed for the system that
would allow users to change the scene of the object
they were observing, but the projection of this scene
was too large to be projected correctly. One of the
users concluded that how precisely and how quickly
the system responds to user input would depend not
on the ASKA3D plate but on the LCD screen being
used and its response time and refresh rate. After
further consideration, this statement was found to be
correct.
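That observation can be illustrated with a rough latency budget: the AIP is passive optics and adds no delay, so end-to-end responsiveness is governed by the input sensor and the LCD. All figures below are assumptions chosen for illustration, not measurements from this system.

```python
# Rough motion-to-photon latency budget for the interaction chain.
# Every figure here is an illustrative assumption, not a measurement.

refresh_hz = 60.0              # assumed LCD refresh rate
lcd_response_ms = 5.0          # assumed grey-to-grey response time
sensor_latency_ms = 50.0       # assumed glove/Kinect input latency

frame_interval_ms = 1000.0 / refresh_hz   # worst-case wait for next frame
total_ms = sensor_latency_ms + frame_interval_ms + lcd_response_ms

# The ASKA3D plate contributes 0 ms: perceived responsiveness is set by
# the input pipeline and the display, as the user suggested.
print(f"approx. motion-to-photon latency: {total_ms:.1f} ms")
```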
The overall experience was rated at 69.4%. This
rating was given by the users when they
considered the entire experience granted by the mid-