duce the problem, but slightly increased it.
Quite surprisingly, the virtual screen showing the camera image stayed well aligned with the virtual scene most of the time, which slightly exceeded our expectations, especially given that the test drone certainly did not have the most accurate sensors available; data from professional drones is likely to be even more accurate.
Further development will focus primarily on solving the problems discovered during testing and on implementing the designed features that have not yet been realized (e.g. the visualization of other sensor data, the point cloud, and the completion of the area boundary visualization). Next, VR glasses with a head tracker, which allow the user to look around the scene naturally, will be connected to the application. Another planned step is to implement a free camera and to test its capabilities.
6 CONCLUSIONS
The aim of this work was to improve the pilot's orientation and to reduce their mental load during remote drone control. Based on research and experience, a system was designed that is based on augmented virtuality, in which on-line data from drone sensors (video stream, flight data, etc.) are integrated into a virtual environment model. The 3D virtual model is built from external data sources such as topographic maps, elevation maps, and 3D building models. The model also includes user-specified mission planning information such as waypoints, safe zone boundaries, and flight directions.
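To illustrate this integration step, the following is a minimal Python sketch of how one live telemetry sample could be placed into the local coordinate frame of the virtual model. All names here are assumptions made for illustration: the Telemetry record, the scene.set_pose call standing in for the engine's transform update, and the simple equirectangular projection are not taken from the actual implementation.

import math
from dataclasses import dataclass

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius; adequate for mission-sized areas

@dataclass
class Telemetry:
    lat: float      # latitude in degrees
    lon: float      # longitude in degrees
    alt: float      # altitude in metres above the scene origin
    heading: float  # degrees clockwise from north

def gps_to_local(lat: float, lon: float,
                 origin_lat: float, origin_lon: float):
    """Equirectangular approximation mapping GPS coordinates to metres
    east/north of the scene origin (sufficient over a few kilometres)."""
    north = math.radians(lat - origin_lat) * EARTH_RADIUS_M
    east = (math.radians(lon - origin_lon)
            * EARTH_RADIUS_M * math.cos(math.radians(origin_lat)))
    return east, north

def update_drone_avatar(scene, drone_id: str, t: Telemetry,
                        origin: tuple) -> None:
    """Place the drone's avatar in the virtual model from one telemetry sample."""
    east, north = gps_to_local(t.lat, t.lon, origin[0], origin[1])
    # scene.set_pose is a hypothetical stand-in for the engine's
    # transform-update call on the drone's avatar object.
    scene.set_pose(drone_id, x=east, y=north, z=t.alt, yaw=t.heading)

The same transform would anchor the virtual camera screen and mission elements (waypoints, zone boundaries) in a shared frame, so each incoming telemetry sample only requires one pose update per drone.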
The system architecture is designed to scale to communication with multiple drones simultaneously. This could be useful in situations where several pilots carry out a mission at the same time and have to work together.
The preliminary user tests indicated that the proposed concept and the technical implementation of the entire system improve the operator's orientation and navigation skills and thus reduce the mental load. More user tests are planned as future work: professional pilots will test the system to refine the concept, to improve existing UI elements or add new ones, and to steer further development according to their needs.
ACKNOWLEDGEMENTS
This work was supported by the Czech Ministry of Education, Youth and Sports through the National Programme of Sustainability (NPU II) project "IT4Innovations excellence in science – LQ1602" and by the Ministry of the Interior of the Czech Republic project VRASSEO (VI20172020068, Tools and methods for video and image processing to improve effectivity of rescue and security services operations).