• constructive and technological unification of
samples and their key functional components;
• noise-resistant multi-channel means and systems
of information and control interaction and
identification;
• intelligent software and algorithmic tools that
allow for recognition of objects and the working
environment, reflexive forecasting of the
development of events, planning of rational (optimal)
behavior, and, as a consequence, adaptively
controlled functioning of special-purpose robots in
uncertain, dynamically changing heterogeneous
application conditions;
• intelligent software and algorithmic tools that
allow for the integration of different types of special-
purpose robots into a single group, with subsequent
control of their joint actions in homogeneous,
heterogeneous, and mixed combat formations;
• intelligent systems for human-machine interfaces
and decision support for operators controlling
special-purpose robots when solving combat (strike,
fire), support, and special tasks.
Various criteria for autonomy can be found in the
literature; one example comes from the Society of
Automotive Engineers (SAE). To help automotive
engineers, governments, and insurance companies
better understand this new technology, SAE has
defined six levels of automotive autonomy, including
a level of no autonomy (SAE, 2023):
• Level 0: Not at all autonomous; the driver has sole
control of the vehicle.
• Level 1: One function is automated, but it does not
necessarily use information about driving conditions.
A vehicle operating with simple cruise control
qualifies as Level 1.
• Level 2: Acceleration, deceleration, and steering
are automated and use sensory data from the
environment to make decisions. Modern cars with
cruise control, automatic lane keeping, or collision
mitigation braking fall into this category. The driver
remains solely responsible for the safe operation of
the vehicle.
• Level 3: At this level, all safety functions are
automated, but the driver must still take control in an
emergency that the car cannot handle. An example
would be Tesla cars with the Autopilot feature
enabled. This is the most controversial level because
it requires the human driver to remain alert and
focused on the driving task even though the car is
doing most of the work. People would naturally find
this situation more tedious than simply driving a car,
and many in the autonomous vehicle community
worry that the driver's attention could be diverted
from the task at hand, leading to disastrous results.
Some automakers choose to skip Level 3 and go
straight to Level 4.
• Levels 4 and 5: These are fully autonomous levels
where the car makes all driving decisions without
human intervention. The difference is that Level 4
cars are limited to a specific set of driving scenarios,
such as city, suburban, and highway driving, while
Level 5 cars can handle any driving scenario,
including off-road driving.
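The six-level classification above can be captured as a simple enumeration. The following is an illustrative Python sketch; the member names paraphrase the SAE level titles, and the helper function with its Level 3 threshold is our own summary of the list above, not part of the SAE standard:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE driving-automation levels, as summarized in the text."""
    NO_AUTOMATION = 0      # driver has sole control of the vehicle
    DRIVER_ASSISTANCE = 1  # one function automated (e.g. simple cruise control)
    PARTIAL = 2            # acceleration, braking, steering automated; driver responsible
    CONDITIONAL = 3        # all safety functions automated; driver must stay alert
    HIGH = 4               # fully autonomous within a limited set of scenarios
    FULL = 5               # autonomous in any driving scenario

def driver_must_supervise(level: SAELevel) -> bool:
    """Through Level 3 a human must remain ready to intervene."""
    return level <= SAELevel.CONDITIONAL

print(driver_must_supervise(SAELevel.PARTIAL))  # True
print(driver_must_supervise(SAELevel.HIGH))     # False
```

Using an `IntEnum` keeps the ordering of levels comparable, which matches how the scale is used: higher numbers mean less human involvement.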
A similar autonomy scale has been adopted
among drone developers (PROXIMA, 2023). It
defines six levels of UAV autonomy (including no
autonomy), modeled on the levels of self-driving
vehicle autonomy.
• Level 0: No autonomy.
• Level 1: Some systems are automated, such as
altitude control, but a human controls the UAV.
• Level 2: Multiple simultaneous systems are
automated, but a human still controls the UAV.
• Level 3: The UAV operates autonomously under
certain conditions, but a person monitors its
movement.
• Level 4: The drone is autonomous in most
situations; a person can take over control, but this is
not necessary.
• Level 5: The drone is completely autonomous.
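The UAV scale can be summarized the same way. This is an illustrative sketch: the level descriptions paraphrase the list above, and `operator_required` is our own shorthand for when a human must control or monitor the UAV, not PROXIMA terminology:

```python
# Paraphrased summary of the UAV autonomy levels listed above.
UAV_AUTONOMY = {
    0: "no autonomy: a human flies the UAV directly",
    1: "one system automated (e.g. altitude control); human in control",
    2: "multiple systems automated simultaneously; human still in control",
    3: "autonomous under certain conditions; human monitors the flight",
    4: "autonomous in most situations; human takeover possible but not necessary",
    5: "completely autonomous",
}

def operator_required(level: int) -> bool:
    """Up to Level 3 a human must control or monitor the UAV."""
    return level <= 3

print(operator_required(3))  # True
print(operator_required(4))  # False
```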
Currently, UAV technology sits between Levels 3
and 4: the drone can make some decisions
autonomously, but a person still needs to monitor its
operation. The main challenge in reaching Level 5
lies in solving the remaining technical problems and
in overcoming legal, regulatory, and even social-
acceptance barriers, which differ across regions.
Through the efforts of the ALFUS (Autonomy
Levels for Unmanned Systems) working group, a
clear diagram has been proposed of what constitutes
system autonomy and by what indicators the
autonomy of a particular system can be assessed
(ALFUS, 2004).
In the detailed model that determines the level of
autonomy, the indicators (sets of metrics) are
organized along three axes of an autonomy “space”:
mission complexity, human dependency, and
environmental complexity.
Mission complexity can be measured using
indicators such as levels of subtask completion,
decision-making and collaboration, knowledge and
perception requirements, planning and execution
efficiency, etc.
The level of human dependency can be measured
using indicators such as interaction time and
frequency, operator workload, skill levels, robot
initiation, etc.
Environmental complexity can be measured by
the size and density of obstacles, traffic, terrain
types, urban traffic characteristics, the ability to
distinguish friends, enemies, and bystanders, etc.
The detailed model of the ALFUS framework
contains the following