Drone Technology for Efficient Warehouse Product Localization

Assia Belbachir 1,a, Antonio M. Ortiz 1,b, Erik T. Hauge 1, Ahmed Nabil Belbachir 1,c,
Giusy Bonanno 2, Emanuele Ciccia 2 and Giorgio Felline 2

1 NORCE Norwegian Research Center, Grimstad, Norway
2 ABS Acciaierie Bertoli Safau, Italy
{assb, aort, ehau, nabe}@norceresearch.no
Keywords:
Warehouse, Self-Products Positioning, Drone.
Abstract:
This paper presents a novel drone-based strategy for enhancing stock-monitoring systems, specifically focusing on the accurate localization of products within defined areas. Traditional localization techniques, which often rely on technologies such as RFID or precision positioning systems, face substantial limitations in terms of accuracy and operational efficiency. To address these issues, we introduce an advanced relative positioning system, uniquely designed to identify and accurately position steel bars relative to each other in an outdoor warehouse environment. The developed approach significantly improves localization precision and speed over conventional methods. Our analysis includes an evaluation of the system's performance, demonstrating advancements in self-localization capabilities. Results indicate a marked enhancement in the accuracy and efficiency of stock monitoring, showcasing the system's potential applicability to a diverse range of products and environments.
1 INTRODUCTION
Ensuring accurate product tracking within industrial environments is important for real-time inventory management and operational efficiency. This paper presents a novel approach highlighting the evolution of product positioning methodologies by computing relative positions. In our specific scenario, we deal with stacks of steel bars located outdoors, each stack identifiable by a unique marker (see Figure 1). These steel bars are moved by forklifts equipped with powerful magnets, leading to positional shifts and interference. The dynamic nature of this environment necessitates constant human intervention for manual scanning, resulting in significant time consumption.
Several technologies are used for product tracking in industry. Radio-Frequency Identification (RFID), known for its non-line-of-sight data transmission capabilities, has been a pioneer, enabling efficient product tracking within diverse environments (Konsynski and Smith, 2003). However, challenges arise in metal-rich environments or in the presence of interference, leading to inaccuracies (Curtin et al., 2007).

a https://orcid.org/0000-0002-1294-8478
b https://orcid.org/0000-0002-7145-8241
c https://orcid.org/0000-0001-9233-3723
Similarly, bar-code systems and QR codes (de Seta, 2023) offer cost-effective solutions but need a direct line of sight, posing labor-intensive challenges in expansive settings. Advanced systems such as Real-Time Location Systems (RTLS) and GPS-based tracking have been investigated for outdoor or large-scale settings (Rácz-Szabó et al., 2020); however, they often lack the precision demanded by industrial standards. Traditional methods are effective, but their efficiency drops in outdoor environments. The integration of unmanned aerial vehicles (UAVs) into warehouse management is a solution for outdoor storage and product-tracking practices. The authors in (Cristiani et al., 2020) explore a multi-robot system using micro-drones with embedded cameras, drastically reducing warehouse management time by exploiting the drones' capacity to navigate indoor shelves and aisles.
Similarly, the work in (Malang et al., 2023) offers an exhaustive review of UAV utilization in warehouse management, identifying key factors influencing drone use and showing their potential applications
in inventory management, intra-logistics, and surveillance in smart warehouses.

Belbachir, A., Ortiz, A., Hauge, E., Belbachir, A., Bonanno, G., Ciccia, E. and Felline, G.
Drone Technology for Efficient Warehouse Product Localization.
DOI: 10.5220/0012947900003822
Paper published under CC license (CC BY-NC-ND 4.0)
In Proceedings of the 21st International Conference on Informatics in Control, Automation and Robotics (ICINCO 2024) - Volume 2, pages 357-364
ISBN: 978-989-758-717-7; ISSN: 2184-2809
Proceedings Copyright © 2024 by SCITEPRESS – Science and Technology Publications, Lda.

Figure 1: Illustration of product positions with QR-code tags.
However, a critical challenge persists in ensuring uninterrupted drone tracking and accurate product position inference. Existing techniques often rely on fixed infrastructure or external markers, which are unsuitable for agile and rapidly changing environments such as outdoor storage yards or manufacturing facilities, as in our case.
In this paper, we present a novel system for relative positioning utilizing images captured by drones. This system represents a fundamental change in product positioning, as it calculates item locations relative to one another rather than depending on fixed points or external infrastructure. By doing so, it establishes a precise mechanism to automatically identify products' locations for autonomous warehouse management operations. Our method offers increased accuracy and efficiency within complex industrial environments. Specifically for environments such as warehouses and manufacturing floors, where conventional systems often fall short in providing accurate and reliable information, our approach stands out for its adaptability without the need for exterior markers or infrastructure and shows outstanding performance.
The remainder of this paper is organized as follows: Section 2 reviews the related work on system positioning. Section 3 describes the proposed framework, as well as the main concepts involved in this work. Implementation and empirical results of the proposed system are detailed in Section 4, while Section 5 summarizes our contributions and sketches some of our future perspectives.
2 STATE OF THE ART
The PILOT system, detailed in (Famili et al., 2023), advances indoor drone localization through Time of Arrival (ToA) analysis of ultrasound signals. This approach tackles the complexity of indoor environments, such as multi-path fading. Moreover, its integration of Frequency Hopping Spread Spectrum (FHSS) technology showcases innovative strategies to enhance location estimation accuracy, although specific precision metrics are not explicitly outlined in the literature.
In outdoor environments, the fusion of the Global Navigation Satellite System (GNSS) with compass-based systems, as highlighted by Flavia et al. in (Causa and Fasano, 2021), has markedly enhanced autonomous drone navigation. Although GNSS-based approaches have gained precision improvements, they may still lack the accuracy needed to locate products stored in densely packed piles.
The employment of stereo vision techniques for indoor drone control, as elucidated by Anand et al. in (George et al., 2023), represents a significant stride in indoor positioning systems. This method emphasizes 3D reconstruction through drone-mounted cameras, boasting high positional accuracy, especially in aligning the drone's yaw rotation with the virtual camera. However, precise accuracy figures remain absent from existing summaries.
Other approaches, such as sensor fusion, integrate ultrasound, LIDAR Time of Flight (ToF) rangefinders, visual odometers, and Ultra-Wide Band (UWB) positioning (Xu et al., 2018), promising approximately 5 cm accuracy during flight.
Furthermore, UWB sensing for indoor precision has led to a system employing impulse-radio ultra-wideband (IR-UWB) two-way ranging (TWR), achieving high precision and interference resilience. With a reported standard deviation of 1.2 cm for single-measurement TWR in semi-closed environments, it holds particular significance for demanding indoor applications.
Building upon these advancements, this paper introduces a new approach for autonomous warehouse management, focusing on accurately identifying the relative positions of product items in dynamic outdoor environments. By combining drone-based image capture with a high-speed processing algorithm supported by a trustworthiness score, this system ensures precise identification of each item's location. Additionally, it provides a user-friendly visualization interface to facilitate product localization in outdoor warehouses.
3 PRODUCTS POSITIONING AND
TRUSTWORTHINESS SYSTEM
FRAMEWORK
In this paper, we present a framework designed to detect and position products utilizing an embedded camera on a drone, augmented with a trustworthiness scoring mechanism. Figure 2 illustrates the utilization of this framework in a smart outdoor warehouse environment.
Figure 2: Schematic representation of the developed system in operation within a smart outdoor warehouse.

Drone Integration: The framework uses information coming from a drone equipped with a camera to gather visual data for product detection and localization at predefined waypoints within the warehouse.

Smart Outdoor Warehouse Operation: Upon capturing visual data, the framework extracts product IDs from each image, represented by QR codes (see Figure 1). Then, it computes relative product positions and assigns trustworthiness scores based on contextual information. These data are then stored in the database for further analysis and visualization.

Database Management: The framework's database contains comprehensive product information, including unique IDs (QR codes), compositions, dimensions, and relative positions (e.g., left, right, top, down).

Products Location Visualization: An intuitive interface is developed to update product locations and facilitate visualization, ensuring seamless integration with existing warehouse management and Enterprise Resource Planning (ERP) tools to optimize product picking.
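A record of the database described above could be modeled as follows; the field names and example values are hypothetical illustrations, not the schema actually used in the deployed system:

```python
from dataclasses import dataclass, field

@dataclass
class ProductRecord:
    """One entry of the framework's product database (field names are our guess)."""
    qr_id: str                     # unique product ID encoded in the QR code
    composition: str = ""          # e.g. steel grade
    dimensions: tuple = ()         # (length, width, height) in metres
    relative_positions: dict = field(default_factory=dict)  # neighbour ID -> relation

# Example: a steel bar with one known neighbour to its right.
record = ProductRecord(qr_id="BAR-042", composition="C45 steel",
                       dimensions=(6.0, 0.2, 0.2))
record.relative_positions["BAR-043"] = "right"
```

Keeping relative positions as a per-neighbour mapping mirrors the paper's relation set (left, right, top, down, and the diagonals) without committing to a fixed coordinate frame.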
Additionally, the framework provides open interfaces for easy integration into existing warehouse management and ERP systems, thereby offering a valuable tool for production and distribution planning. It contributes to resource optimization and enhances industrial processes.
The subsequent subsection details the algorithm for determining relative product positions, followed by an explanation of the trustworthiness scoring computation.
3.1 Relative Product Position
This subsection focuses on determining the relative
positions of products within captured images. It in-
volves detecting the IDs of existing products in each
image and computing their relative positions. The
framework defines eight possible relations between
products: up, down, left, right, up-right, up-left,
down-right, and down-left. These relations allow
for understanding the spatial arrangement of products
within the images. The algorithm employed for this
task involves computing the central position of de-
tected corners from QR codes and determining the
relative positions based on the computed coordinates.
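The central position of a detected QR code can be taken as the mean of its corner points; a minimal sketch (the helper name get_position is ours):

```python
def get_position(corners):
    """Central position of a QR code: the mean of its detected corner points."""
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Center of a 10x10 square of corners at the origin:
# get_position([(0, 0), (10, 0), (10, 10), (0, 10)]) -> (5.0, 5.0)
```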
Algorithm 1 takes as input the positions of two points (pos1 and pos2) and an optional threshold value, where the threshold represents the sensor error, and computes the relative direction between the two points from their coordinates. It extracts the x and y coordinates of pos1 and pos2 and calculates their differences, storing them in the variables dx and dy. If both |dx| and |dy| are less than the threshold, direction is set to 'same', indicating that the points are at the same position. If |dx| > |dy|, the sign of dx is inspected: if dx is positive, direction is set to 'right'; otherwise, to 'left'. If |dy| also exceeds the threshold, the direction is extended with 'down' if dy is positive or 'up' if dy is negative, and vice versa for the case where |dy| dominates. Finally, the algorithm returns the computed direction as the relative position between pos1 and pos2.
By accurately determining the relative positions of products, the framework facilitates tasks such as inventory management and product localization in industrial settings.
3.2 Trust-Ability Computation
This subsection discusses the computation of trust
relationships among products based on their rela-
tive positions. Trust relationships are essential for
minimizing errors in the identification and localiza-
tion processes. The framework defines a “directed
trust graph´´ where nodes represent product IDs and
edges represent trust relationships between products’
relative positions. The directed trust graph G =
Algorithm 1: determine_relation.
1: Input: pos1, pos2, threshold = 10
2: x1, y1 ← pos1
3: x2, y2 ← pos2
4: dx, dy ← x2 − x1, y2 − y1
5: direction ← None
6: if |dx| < threshold and |dy| < threshold then
7:   direction ← 'same'
8: else if |dx| > |dy| then
9:   direction ← 'right' if dx > 0 else 'left'
10:  if |dy| > threshold then
11:    direction ← ('down-' if dy > 0 else 'up-') + direction
12:  end if
13: else
14:  direction ← 'down' if dy > 0 else 'up'
15:  if |dx| > threshold then
16:    direction ← direction + ('-right' if dx > 0 else '-left')
17:  end if
18: end if
19: Return direction
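A direct Python transcription of Algorithm 1 (a sketch; image coordinates with y growing downward are assumed, so positive dy means "down"):

```python
def determine_relation(pos1, pos2, threshold=10):
    """Relative direction of pos2 with respect to pos1.

    The threshold models sensor error: displacements below it are ignored.
    """
    x1, y1 = pos1
    x2, y2 = pos2
    dx, dy = x2 - x1, y2 - y1
    if abs(dx) < threshold and abs(dy) < threshold:
        return "same"                       # both offsets within sensor error
    if abs(dx) > abs(dy):
        direction = "right" if dx > 0 else "left"
        if abs(dy) > threshold:             # significant vertical component too
            direction = ("down-" if dy > 0 else "up-") + direction
    else:
        direction = "down" if dy > 0 else "up"
        if abs(dx) > threshold:             # significant horizontal component too
            direction += "-right" if dx > 0 else "-left"
    return direction
```

For example, a neighbour 50 px to the right and 40 px below yields "down-right", matching the eight-relation set defined in Section 3.1.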
(V, E, R, φ), where the node set V represents product IDs and the edge set E represents trust relationships among the products' relative positions. The trust graph enumerates the different types of trust relationships R. The mapping function φ : E → R maps each observed edge to a trust relationship type, so every edge corresponds to exactly one trust relationship. Moreover, trustworthiness varies across application domains.
In our case, the trust relations between nodes have
eight types, i.e., R = {left, right, up, down, up-left,
down-left, up-right, down-right}. Other scenarios or
applications can require a different number of rela-
tions, and the trust-ability graph would automatically
adapt.
Trust Evaluation. The trust evaluation task is to predict the unobserved trust relationships in a trust graph G. Specifically, given a trust graph G = (V, E, R, φ), the goal is to give more weight to cumulative information that is confirmed from the other direction. For example, if node A reports that B is on its right and B reports that A is on its left, the trust value is strongly increased. Note that trust relationships are directed: the relationship from node u to node v is not equal to the relationship from node v to node u, but the pair increases trust-ability when the two express the same semantic relation.
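The bidirectional reinforcement rule described above can be sketched as follows; the inverse-relation table and the reinforce helper are our own illustration of the rule, not the paper's exact implementation:

```python
# Semantic inverse of each directed relation: A right-of B <=> B left-of A.
INVERSE = {
    "left": "right", "right": "left", "up": "down", "down": "up",
    "up-left": "down-right", "down-right": "up-left",
    "up-right": "down-left", "down-left": "up-right",
}

def reinforce(trust, relation_ab, relation_ba):
    """Update a trust score from the two directed observations of a pair.

    Agreement (A says B is right, B says A is left) raises trust;
    contradiction lowers it, never below zero.
    """
    if INVERSE[relation_ab] == relation_ba:
        return trust + 1
    return max(0, trust - 1)
```

With this rule the consistent pair ("right", "left") raises the score, while an inconsistent pair such as ("right", "up") lowers it, matching the update illustrated in Figure 3.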
Figure 3 shows an instance of trust-ability computation for node A. Initially, all relations of node A are portrayed in Figure 3(a), each starting with a trust-ability score of zero (Up, Up-right, Right, Down-right, Down, Down-left, Left, and Up-left).
In Figure 3(b), let us consider the computed relation being Up-right with node B, resulting in an increment of one to the trust-ability score of the Up-right edge of A. Now, suppose the subsequent computed relation for node A indicates that B is to its Right. Following the same process, the trust-ability score of the Right relation of node A increases by 1. Consequently, both relations, Up-right and Right, pertaining to node A exhibit identical trust-ability scores of one for the same node B.
Continuing, let us imagine that in another computation B concludes that A is positioned to its left (as depicted in Figure 3(d)). Consequently, A doubles the trust-ability score of its Right link with B, now standing at two, while simultaneously decreasing the trust-ability score of the Up-right link with B by one. At this point, the maintained relation between A and B is Right, with a trust-ability of two.
Upon completion of this process for all relations of node A, only one relation includes node B: the relation with the highest trust-ability score is selected to determine the link.
The final generated graph is a connected graph
(see Algorithm 2).
The algorithm for trust-ability computation eval-
uates trust relationships based on observed edges in
the trust graph, considering factors such as product
IDs, relative positions, and trust scores. By comput-
ing trust relationships, the framework enhances the re-
liability of product localization and warehouse man-
agement systems.
3.3 Video Capture Loop and
Trust-Ability Score
This subsection describes the operational execution
of the framework’s video capture process and the
Figure 3: Illustration of an example of trust-ability update.
Algorithm 2: Trust-ability Computation.
1: function COMPUTE_TRUST_ABILITY(all_relationships)
2:   combined_trust ← {}
3:   for all relation in all_relationships do
4:     qr_id ← relation['qr_id']
5:     neighbor_id ← relation['neighbor_id']
6:     trust ← relation['trust']
7:     inverse_rel ← inverse_relation(relation['relation'])
8:     key ← (min(qr_id, neighbor_id), max(qr_id, neighbor_id))
9:     if key not in combined_trust then
10:      combined_trust[key] ← {'trust': 0, 'relations': []}
11:    end if
12:    combined_trust[key]['trust'] += trust
13:  end for
14:  return bound_trust(combined_trust)
15: end function
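Algorithm 2 can be sketched in Python as follows; inverse_relation and bound_trust are only referenced in the paper, so their bodies below (including the clamp to the 0-5 score range used in the experiments) are our assumptions:

```python
INVERSE = {
    "left": "right", "right": "left", "up": "down", "down": "up",
    "up-left": "down-right", "down-right": "up-left",
    "up-right": "down-left", "down-left": "up-right",
}

def inverse_relation(relation):
    # Semantic inverse of a directed relation (A right-of B <=> B left-of A).
    return INVERSE[relation]

def bound_trust(combined_trust, max_score=5):
    # Clamp accumulated scores to the 0..5 range used in the experiments.
    for entry in combined_trust.values():
        entry["trust"] = max(0, min(max_score, entry["trust"]))
    return combined_trust

def compute_trust_ability(all_relationships):
    combined_trust = {}
    for relation in all_relationships:
        qr_id = relation["qr_id"]
        neighbor_id = relation["neighbor_id"]
        trust = relation["trust"]
        # The full system compares inverse_relation(...) with the neighbour's
        # own observation to decide agreement before scoring.
        inverse_rel = inverse_relation(relation["relation"])
        # One undirected key per product pair, so A->B and B->A accumulate together.
        key = (min(qr_id, neighbor_id), max(qr_id, neighbor_id))
        entry = combined_trust.setdefault(key, {"trust": 0, "relations": []})
        entry["trust"] += trust
        entry["relations"].append(relation["relation"])
    return bound_trust(combined_trust)
```

The undirected key is the detail that makes the bidirectional reinforcement of Section 3.2 work: both directed observations of a pair land in the same accumulator.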
computation of trust-ability scores in real time. The video capture loop continuously captures frames from a drone-mounted camera, pre-processes them to enhance QR code visibility, and detects QR codes within the frames. For each pair of detected QR codes, the framework computes their relative positions and trust scores based on the computed relations. It visualizes the trust graph, incorporating both positions and trust scores, to provide a comprehensive understanding of spatial interactions between products. By performing these tasks in real time, the framework enables efficient monitoring and management of products in industrial environments, enhancing productivity and accuracy in inventory-related tasks.
Algorithm 3: Video Capture Loop and trust-ability score.
1: while drone is operational do
2:   Capture video frame
3:   if frame is not empty then
4:     preprocessed_frame ← preprocess_image(frame)
5:     corners, data ← detect_qr_codes(preprocessed_frame)
6:     for all QR codes (corner_i, qr_id_i) do
7:       pos_i ← get_position(corner_i)
8:       for all QR codes (corner_j, qr_id_j) do
9:         pos_j ← get_position(corner_j)
10:        relation ← determine_relation(pos_i, pos_j)
11:        trust ← compute_trust(qr_id_i, qr_id_j, relation, frame)
12:        visualize_graph_trust(graph, pos)
13:      end for
14:    end for
15:  end if
16: end while
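A runnable sketch of the capture loop, with frame capture and QR detection injected as callables so it can be exercised without a drone; the preprocessing, trust-scoring, and visualization steps are omitted, and the helper names are ours:

```python
def get_position(corners):
    # Center of the QR code: mean of its corner points.
    return (sum(x for x, _ in corners) / len(corners),
            sum(y for _, y in corners) / len(corners))

def determine_relation(pos1, pos2, threshold=10):
    # Relative direction of pos2 w.r.t. pos1 (image coords, y grows downward).
    dx, dy = pos2[0] - pos1[0], pos2[1] - pos1[1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return "same"
    if abs(dx) > abs(dy):
        direction = "right" if dx > 0 else "left"
        if abs(dy) > threshold:
            direction = ("down-" if dy > 0 else "up-") + direction
    else:
        direction = "down" if dy > 0 else "up"
        if abs(dx) > threshold:
            direction += "-right" if dx > 0 else "-left"
    return direction

def capture_loop(capture_frame, detect_qr_codes):
    """Video capture loop: collects the directed relations observed in all frames.

    capture_frame() returns a frame, or None once the drone stops streaming;
    detect_qr_codes(frame) returns a list of (corners, qr_id) pairs.
    """
    observations = []
    while True:
        frame = capture_frame()
        if frame is None:
            break
        detections = detect_qr_codes(frame)
        for corners_i, id_i in detections:
            pos_i = get_position(corners_i)
            for corners_j, id_j in detections:
                if id_i == id_j:
                    continue
                relation = determine_relation(pos_i, get_position(corners_j))
                observations.append((id_i, id_j, relation))
    return observations
```

Injecting the camera and detector as functions keeps the loop testable offline; in deployment they would wrap the drone's video stream and a QR decoder.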
4 EXPERIMENTAL RESULTS
In our experiments, we used a drone equipped with
a camera to capture real-time videos in two types
of scenarios: indoor and outdoor warehouse environ-
ments. The drone was flown over waypoints, captur-
ing video sequences from which QR codes were ex-
tracted. To pre-process the captured frames, we uti-
lized the preprocess image() function (see Algorithm
3), which converted the RGB images to grayscale and
applied adaptive thresholding to enhance the visibil-
ity of the QR codes. Subsequently, the detect qr -
codes() function was employed to identify QR codes
within each frame, returning their relative positions
and the contents of the information (ID). Then, a di-
rected graph was generated based on the detected re-
lations, and a trust-ability score was computed.
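The preprocessing step (grayscale conversion plus mean adaptive thresholding) can be approximated with a numpy-only stand-in; the block size and offset parameters below are our assumptions, not the values used in the experiments:

```python
import numpy as np

def preprocess_image(frame, block=15, c=5):
    """Grayscale + mean adaptive threshold, a numpy stand-in for the
    OpenCV-style preprocessing described in Section 4."""
    gray = frame.mean(axis=2) if frame.ndim == 3 else frame.astype(float)
    pad = block // 2
    padded = np.pad(gray, pad, mode="edge")  # replicate borders for the window
    h, w = gray.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for i in range(h):
        for j in range(w):
            local_mean = padded[i:i + block, j:j + block].mean()
            # Pixel is white where it is brighter than its local mean minus c.
            out[i, j] = 255 if gray[i, j] > local_mean - c else 0
    return out
```

Thresholding against a local rather than global mean is what keeps QR modules separable under the uneven outdoor lighting reported in the results.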
4.1 Indoor Obtained Results
Figure 4 represents from top to down (1) one test en-
vironment, (2) the generated trust-ability graph and
(3) the user interface for products details information.
The graph was annotated with trust scores, show-
casing the level of confidence in each relationship.
Trust scores were computed using the compute trust
function, taking into account neighbor relative posi-
tion.
Table 1: Results obtained with different numbers of products in the indoor environment.
Number of Products Detected Products Position Accuracy (%) Trust-ability (max 5)
5 5 100 4
22 18 94 4
30 25 94 3.8
43 36 94 3.8
90 78 94 3.8
Figure 4: Representation of an example of the products' image, the generated graph with the computed trust-ability score for each arrow, and the warehouse user visualization.
Table 1 provides an overview of the performance metrics across different scenarios. The Number of Products column indicates the number of products present in the environment during each test, and the Detected Products column gives the number of detected QR codes.
Position Accuracy represents the proportion of detected products that are located in the correct position within an error of 2 cm; more precisely, it counts the correctly located products among those that have been detected. Trust-ability is crucial for understanding the reliability of relationships inferred from the detected QR codes and is computed using Algorithm 2. For our experiments, we chose trust-ability values ranging from 0 to 5, with higher scores indicating greater reliability.
The indoor results show that more than 81% of the products were detected, with a product-positioning accuracy of no less than 94%. The trust-ability is higher than 3, which means that the majority of the relations between the nodes can be trusted.

Figure 5: Top view of the outdoor testing environment.
4.2 Outdoor Obtained Results
Figure 5 illustrates the outdoor testing environment, showcasing the algorithm's real-world applicability. The objective was to demonstrate the effectiveness of the proposed algorithm in locating products by detecting QR codes and positioning products within the captured frames in an outdoor environment. Due to logistical constraints, the number of outdoor products is lower than in the indoor tests.
Table 2 provides an overview of the performance metrics across different scenarios. As in the indoor experiments, the trust-ability values range from 0 to 5, with higher scores indicating greater reliability. Note that each line for a specific Number of Products represents an average over all the drone videos recorded for that setting.
Table 2 shows that the developed approach maintained a high level of accuracy in detecting QR codes even in outdoor environments. However, a small decrease in position accuracy was observed with an increasing number of products. This decrease can be attributed to the increased difficulty of distinguishing QR codes under environmental factors such as lighting variations, shadows, and potential occlusions.
Despite the decrease in accuracy at higher product densities, the algorithm's robustness and adaptability in outdoor environments are promising.

Table 2: Statistical results obtained with different numbers of products in the outdoor environment.
Number of Products Detected Products Position Accuracy (%) Trust-ability (max 5)
5 5 100 4
11 10 90 3.8
21 16.8 79 3.2

Figure 6: Representation of an example of the products' image, the generated graph with the computed trust-ability score for each arrow, and the warehouse user visualization.
The drone continuously captured video, and the algorithm demonstrated consistent performance in detecting and positioning QR codes without any external positioning system.
In conclusion, our experiments validated the effectiveness of the proposed algorithm, highlighting its applicability to product localization in warehouses, where accurate QR code detection and relative product positioning help improve warehouse management.
5 CONCLUSION AND FUTURE
WORK
In conclusion, this paper has addressed the important need for automating the monitoring of product positions within outdoor industrial environments, emphasizing the challenges and advancements in the field. The evolution of product positioning techniques has been explored, with a focus on overcoming limitations faced by traditional methods such as RFID, bar-code systems, and GPS-based tracking.
The integration of unmanned aerial vehicles (UAVs) into warehouse management has been identified as a next-generation solution for large-scale and outdoor environments.
Recognizing the challenges in continuous drone tracking and accurate product inference, this paper introduces a novel system of relative positioning using drone-captured images. This innovative approach calculates the location of products in relation to each other, revolutionizing traditional fixed-point and external infrastructure systems. Grounded in recent advancements in sensor technology and data processing algorithms, this method offers enhanced accuracy and efficiency, particularly suited for complex industrial environments like large (outdoor) warehouses and manufacturing floors.
The proposed framework has been detailed, emphasizing the use of embedded cameras on drones, a smart outdoor warehouse, a comprehensive database, and a product-location visualization interface. The algorithm for relative product positioning, involving QR code detection and trust-ability graph computation, has been presented in detail.
Experimental results demonstrate the algorithm's effectiveness in indoor settings, showcasing its adaptability and robustness in various environments. In outdoor locations, the system also achieved good accuracy in detecting product locations.
In summary, this work contributes to the evolving landscape of industrial product positioning by presenting an innovative solution that uses drone technology and advanced algorithms. The proposed system has the potential to significantly enhance inventory management and operational workflows in diverse industrial settings. Future perspectives involve further refining the algorithm for better detection, expanding the system's capabilities to other missions such as product stability, and exploring applications in other domains.
ACKNOWLEDGEMENT
The COGNIMAN project¹, leading to this paper, has received funding from the European Union's Horizon Europe research and innovation programme under grant agreement No 101058477.
REFERENCES
Causa, F. and Fasano, G. (2021). Improving navigation in GNSS-challenging environments: Multi-UAS cooperation and generalized dilution of precision. IEEE Transactions on Aerospace and Electronic Systems, 57(3):1462–1479.
Cristiani, D., Bottonelli, F., Trotta, A., and Di Felice, M. (2020). Inventory management through mini-drones: Architecture and proof-of-concept implementation. In 2020 IEEE 21st International Symposium on "A World of Wireless, Mobile and Multimedia Networks" (WoWMoM), pages 317–322.
Curtin, J., Kauffman, R. J., and Riggins, F. J. (2007). Making the 'most' out of RFID technology: A research agenda for the study of the adoption, usage and impact of RFID. Inf. Technol. and Management, 8(2).
de Seta, G. (2023). QR code: The global making of an infrastructural gateway. Global Media and China, 8(3):362–380.
Famili, A., Stavrou, A., Wang, H., and Park, J.-M. (2023). PILOT: High-precision indoor localization for autonomous drones. IEEE Transactions on Vehicular Technology, 72(5):6445–6459.
George, A., Koivumäki, N., Hakala, T., Suomalainen, J., and Honkavaara, E. (2023). Visual-inertial odometry using high flying altitude drone datasets. Drones, 7(1).
Konsynski, B. R. and Smith, H. A. (2003). Developments in practice X: Radio frequency identification (RFID) - an internet for physical objects. Commun. Assoc. Inf. Syst., 12:19.
Malang, C., Charoenkwan, P., and Wudhikarn, R. (2023). Implementation and critical factors of unmanned aerial vehicle (UAV) in warehouse management: A systematic literature review. Drones, 7(2).
Rácz-Szabó, A., Ruppert, T., Bántay, L., Löcklin, A., Jakab, L., and Abonyi, J. (2020). Real-time locating system in production management. Sensors, 20(23).
Xu, Y., Shmaliy, Y. S., Ahn, C. K., Tian, G., and Chen, X. (2018). Robust and accurate UWB-based indoor robot localisation using integrated EKF/EFIR filtering. IET Radar, Sonar & Navigation, 12(7):750–756.
¹ www.cogniman.eu