Towards UAV-USV Collaboration in Harsh Maritime Conditions
Including Large Waves
Filip Novák (https://orcid.org/0000-0003-3826-5904), Tomáš Báča (https://orcid.org/0000-0001-9649-8277),
Ondřej Procházka (https://orcid.org/0009-0009-2224-750X) and Martin Saska (https://orcid.org/0000-0001-7106-3816)
Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Czech Republic
Keywords:
Unmanned Aerial Vehicle, Unmanned Surface Vehicle, Boat Dynamics, Boat Model, State Estimation.
Abstract:
This paper introduces a system designed for tight collaboration between Unmanned Aerial Vehicles (UAVs)
and Unmanned Surface Vehicles (USVs) in harsh maritime conditions characterized by large waves. This
onboard UAV system aims to enhance collaboration with USVs for following and landing tasks under such
challenging conditions. The main contribution of our system is the novel mathematical USV model, describing
the movement of the USV in 6 degrees of freedom on a wavy water surface, which is used to estimate and
predict USV states. The estimator fuses data from multiple global and onboard sensors, ensuring accurate
USV state estimation. The predictor computes future USV states using the novel mathematical USV model
and the last estimated states. The estimated and predicted USV states are forwarded into a trajectory planner
that generates a UAV trajectory for following the USV or landing on its deck, even in harsh environmental
conditions. The proposed approach was verified in numerous simulations and deployed to the real world,
where the UAV was able to follow the USV and land on its deck repeatedly.
1 INTRODUCTION
Unmanned Aerial Vehicles (UAVs) have already proven their efficiency in numerous marine applications. UAVs are helpful in search and rescue operations (Murphy et al., 2008), monitoring marine animals (Aniceto et al., 2018), monitoring water quality (Román et al., 2023), and cleaning oceans of garbage and oil spills (Han and Ma, 2021). However, UAVs are limited by short battery life, which reduces their operational time. This limitation is problematic, as marine tasks are often executed at long distances from offshore base stations. Therefore, UAVs often collaborate with Unmanned Surface Vehicles (USVs) (Han and Ma, 2021; Murphy et al., 2006), which can compensate for the short battery life of UAVs via a power umbilical tether (Talke et al., 2018) or by providing a docking spot for battery recharging (Aissi et al., 2020).
In order to provide power supply via an umbilical
tether from the USV to the UAV, the UAV has to pre-
cisely follow the USV at the specified distance based
Figure 1: The tight collaboration between UAVs (marked
with white circles) and USVs during the landing and fol-
lowing tasks using our system presented in this paper.
on the tether length (Talke et al., 2018), which re-
quires estimating and predicting USV movement even
in harsh conditions such as rough waters with large
waves. Similarly, estimating and predicting USV
movement is crucial for the UAV during landing on
the USV docking spot (Gupta et al., 2023). The os-
cillating and tilting USV on a wavy water surface can
significantly damage the landing UAV or even cause the UAV to fall into the water.
The approach proposed in this paper allows tight
collaboration between UAVs and USVs (see Fig. 1),
such as following and landing tasks on rough water
surfaces. The key component of the presented system
is the USV state estimator, which runs onboard the
UAV and fuses data from the onboard sensors of both
robots. Using a novel mathematical USV model con-
taining wave dynamics, the estimated states are used
to predict future USV movement on the wavy water
surface. The estimated and predicted USV states en-
able precise UAV trajectory planning for following
and landing. The proposed approach was verified in
numerous simulations and real-world experiments in
both use cases: following the USV and landing on the
USV docking spot.
The main contributions of the paper are summarized in the following points:
• We propose a novel linear USV mathematical model containing wave dynamics that enables accurate estimation and prediction of the USV movement on a wavy water surface in 6 Degrees of Freedom (DOFs).
• We introduce a novel onboard UAV system for tight collaboration between UAVs and USVs in following and landing tasks in harsh maritime conditions, including large waves.
2 RELATED WORKS
The works (Meng et al., 2019; Lee et al., 2019) pro-
pose solutions for the following and landing of fixed-
wing UAVs on marine vessels. The method presented
in (Meng et al., 2019) uses the auto-regressive model
to predict the landing pad position at the touchdown
moment. The landing pad is equipped with infrared
targets that are detected onboard the UAV to measure the
relative position of the landing pad. The approach
(Lee et al., 2019) relies on a sliding-mode control
scheme to guide the UAV towards the landing point
by following a desired reference trajectory. The po-
sition of the landing point at touchdown time is pre-
dicted to be used in the final phase of the landing.
However, neither of these methods has been verified
through real-world experiments, and the harsh envi-
ronment is not considered in simulations.
An internal-model-based approach for Vertical
Take-Off and Landing (VTOL) UAVs is introduced in
(Marconi et al., 2002). This approach is designed for
autonomous landing on a vertically oscillating deck.
The oscillations of the deck are modeled as a sum
of sinusoidal functions. However, the oscillations are
considered only in heave motion. The approach does
not consider the pitch and roll motions of the landing
deck. Furthermore, the approach is not deployed in
the real world, and its validation is limited to simula-
tions.
A visual-based autonomous landing of UAV on a
moving USV is presented in (Keller and Ben-Moshe,
2022). The method (Keller and Ben-Moshe, 2022)
relies only on the position of the landing platform.
However, it’s important to note that this method does
not account for the roll and pitch motions during land-
ing, which can pose a risk, particularly in rough water
conditions where the rolling and pitching of the land-
ing platform may potentially damage the UAV. Al-
though the method (Keller and Ben-Moshe, 2022) is
verified in real-world experiments, the method is not
designed for harsh environments. A similar vision-
based approach is presented in (Venugopalan et al.,
2012). This approach also considers only the USV position, as does (Keller and Ben-Moshe, 2022), and is deployed in the real world. Neither of the methods (Keller and Ben-Moshe, 2022; Venugopalan et al., 2012) analyzes the influence of waves on the USV motion, which leaves them unprepared for harsh conditions.
The method for following and landing verified in
the real world is presented in (Xu et al., 2020). In
this method, the UAV uses a camera to detect a tag placed on the landing platform, which is mounted onboard the USV. The detected tag is then used to estimate
the relative position from the UAV to the USV. A
similar estimation method is also employed in (Yang
et al., 2021). Both methods are verified by conducting
real-world experiments. However, harsh conditions,
such as large waves, as well as pitch and roll motions
of the USV, are not considered in these methods.
A complex system estimating USV motion in 5 DOFs is proposed in (Abujoub et al., 2018). The
estimation method differs from previous vision-based
approaches by utilizing Light Detection and Rang-
ing (LiDAR) to measure the position and orientation
of the USV. Additionally, the future roll and pitch
motions are predicted as the sum of harmonic func-
tions representing the wave motions. The parame-
ters of these harmonic functions are derived from Fast
Fourier Transform (FFT) of estimated roll and pitch
angles. However, this method does not consider wave
motion in the heave state of the USV, which could
potentially lead to unsafe landing or following, espe-
cially in rough conditions. Furthermore, the method
is solely tested in simulations and has not been vali-
dated in real-world experiments.
The above-mentioned methods (Keller and Ben-
Moshe, 2022; Venugopalan et al., 2012; Xu et al.,
2020; Yang et al., 2021; Abujoub et al., 2018) rely
solely on vision methods to estimate USV states.
However, a significant limitation of these methods
arises when the USV falls outside the Field of View
(FOV) of the UAV sensors, rendering it impossible
to estimate the position and orientation of the USV.
In such cases, the UAV must actively search for the
USV, leading to increased energy consumption and
limiting the time dedicated to the UAV mission. An
alternative approach, as presented in (Zhang et al.,
2021), combines Global Navigation Satellite System
(GNSS) sensors placed on the USV with onboard
UAV vision-based systems to estimate the position of
the USV. However, it’s important to note that nei-
ther the orientation of the USV nor wave motions
are accounted for in the estimation process. Conse-
quently, the method presented in (Zhang et al., 2021)
is deemed unsuitable for harsh conditions.
A method providing USV state estimation in full 6 DOFs is presented in (Polvara et al., 2018). The UAV detects a tag placed on the USV board in camera images and utilizes it for the estimation. Subse-
quently, the method controls the UAV landing on the
USV board. However, the waves are not considered
for any USV state, making the method unsuitable for
harsh conditions that are considered in this paper. The
method for landing UAV on USV in harsh conditions
is presented in (Gupta et al., 2023). The method con-
siders waves in the USV motion and predicts future
USV states to perform a safe landing. However, the
assumption made in (Gupta et al., 2023) that the USV
is not moving horizontally on the water surface ren-
ders the method unsuitable for the task of UAV fol-
lowing. Additionally, both methods (Polvara et al.,
2018; Gupta et al., 2023) are purely vision-based,
causing aforementioned issues if the USV is not in
the FOV of the UAV onboard sensors.
In comparison with the above-mentioned meth-
ods, our system integrates both USV onboard sensors
and UAV onboard sensors to accurately estimate the
USV motion in full 6 DOFs, while factoring in wave
dynamics at every state. Additionally, our system pre-
dicts the future USV states in 6 DOFs, incorporating
wave motions. Therefore, our system enables tight
UAV-USV collaboration in extreme environmental
conditions at different relative UAV-USV distances
constrained by the communication range. The esti-
mated and predicted USV states serve as inputs to the
UAV trajectory planner, based on the Model Predic-
tive Control (MPC) method, enabling the UAV to fol-
low the USV and land on it even in harsh environmen-
tal conditions.
3 MATHEMATICAL USV MODEL
We identify accurate state estimation and prediction
of USV states as crucial features for tight UAV-USV
collaboration, enabling the UAV to follow the USV
and land on its deck even in harsh conditions, includ-
ing large waves. In this section, we present a novel
linear USV model containing wave dynamics in order
to fuse data from multiple sensors, thereby increas-
ing estimation and prediction accuracy. We model the
USV in 6 DOFs, consisting of 3D translation (surge x, sway y, and heave z) and 3D rotation in terms of intrinsic Euler angles (roll φ, pitch θ, and yaw ψ), as illustrated in Fig. 2.

Figure 2: Depiction of the world frame W = {w_x, w_y, w_z}, the vessel parallel coordinate frame VP = {vp_x, vp_y, vp_z}, and the USV body-fixed coordinate frame B_b = {b_{b,x}, b_{b,y}, b_{b,z}}, with surge along b_{b,x}, sway along b_{b,y}, and heave along b_{b,z}.
In order to analyze the USV motion, three coordinate frames are introduced: the world coordinate frame W = {w_x, w_y, w_z}, the body-fixed coordinate frame B_b = {b_{b,x}, b_{b,y}, b_{b,z}}, and the vessel parallel coordinate frame VP = {vp_x, vp_y, vp_z}, as shown in Fig. 2. The position p_L = (x_L, y_L, z_L) and rotation Θ_L = (φ_L, θ_L, ψ_L) of the USV are expressed in the vessel parallel coordinate frame VP, which is parallel to the body-fixed coordinate frame B_b and placed at the origin of the world coordinate frame W. The USV state expressed in the vessel parallel coordinate frame, η_L = (p_L, Θ_L), is transformed to the world coordinate frame as η = (p, Θ) by

\eta = J_\psi(\psi)\,\eta_L, \qquad (1)
where the transformation matrix J_ψ(ψ) is defined as

J_\psi(\psi) = \begin{bmatrix} R_\psi & O_{3\times 3} \\ O_{3\times 3} & I_{3\times 3} \end{bmatrix}, \qquad (2)

R_\psi = \begin{bmatrix} \cos\psi & -\sin\psi & 0 \\ \sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix}, \qquad (3)

O_{3×3} ∈ R^{3×3} is a zero matrix, and I_{3×3} ∈ R^{3×3} denotes the identity matrix. The vessel parallel coordinate frame VP enables us to use an identity matrix as the transformation between the velocity state vector ν and the derivative of the vector η_L. The linear velocity v = (u, v, w) and the angular velocity ω = (p, q, r) form the state vector ν = (v, ω), expressed in the body-fixed coordinate frame B_b.
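As a quick illustration of the transform in (1)-(2), the following minimal NumPy sketch (not part of the system described in this paper) applies J_ψ(ψ) to a vessel parallel state vector; the numeric values of η_L are arbitrary examples.

```python
import numpy as np

def J_psi(psi: float) -> np.ndarray:
    """Build the 6x6 transform of eq. (2): yaw rotation on the position
    block and identity on the orientation block."""
    c, s = np.cos(psi), np.sin(psi)
    J = np.zeros((6, 6))
    J[:3, :3] = np.array([[c, -s, 0.0],
                          [s,  c, 0.0],
                          [0.0, 0.0, 1.0]])
    J[3:, 3:] = np.eye(3)
    return J

# eta_L = (x_L, y_L, z_L, phi_L, theta_L, psi_L); the values are arbitrary examples.
eta_L = np.array([1.0, 0.5, 0.1, 0.02, -0.01, 0.3])
eta_world = J_psi(eta_L[5]) @ eta_L   # eq. (1)
```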
Our novel USV model builds upon the USV motion analysis presented in (Fossen, 2011) and extends it with wave dynamics, resulting in a mathematical model that describes the USV motion on a rough water surface. First, the equations of USV motion are

\dot{\eta}_L = \nu, \qquad (4)

\dot{\nu} = -(M_I + M_A)^{-1}\,(D\,\nu + G\,\eta_L), \qquad (5)
where M_I ∈ R^{6×6} represents the inertia matrix and M_A ∈ R^{6×6} is the hydrodynamic added mass occurring due to the motion of the USV through the fluid. The matrix D ∈ R^{6×6} represents the linear damping, and the term G ∈ R^{6×6} denotes the matrix of gravitational forces and torques, also called restoring forces.

We model the waves as an oscillatory motion in each USV state. Therefore, we present one wave component as a 2 DOF linear state-space model, defined by matrices A_ω and C_ω as

\begin{bmatrix} \dot{x}_{\omega 1} \\ \dot{x}_{\omega 2} \end{bmatrix} = \underbrace{\begin{bmatrix} 0 & 1 \\ -\omega_0^2 & -2\lambda\omega_0 \end{bmatrix}}_{A_\omega} \begin{bmatrix} x_{\omega 1} \\ x_{\omega 2} \end{bmatrix}, \qquad (6)

y_\omega = \underbrace{\begin{bmatrix} 0 & 1 \end{bmatrix}}_{C_\omega} \begin{bmatrix} x_{\omega 1} \\ x_{\omega 2} \end{bmatrix}, \qquad (7)
where x_ω = (x_{ω1}, x_{ω2}) is the state of the wave component, λ is a damping term, and ω_0 denotes the frequency of the wave component. To model wave motion with complex frequency spectra, we combine N_c ∈ Z^+ wave components from equations (6) and (7). Each component is characterized by different parameters λ and ω_0:

\dot{x}_{\omega_1} = A_{\omega_1} x_{\omega_1}, \qquad (8)
y_{\omega_1} = C_{\omega_1} x_{\omega_1}, \qquad (9)
\vdots
\dot{x}_{\omega_{N_c}} = A_{\omega_{N_c}} x_{\omega_{N_c}}, \qquad (10)
y_{\omega_{N_c}} = C_{\omega_{N_c}} x_{\omega_{N_c}}, \qquad (11)
y_{wave} = y_{\omega_1} + \ldots + y_{\omega_{N_c}}. \qquad (12)
The equations (8)-(12) can be expressed as

\dot{x}_{wave} = A_{wave} x_{wave}, \qquad (13)
y_{wave} = C_{wave} x_{wave}, \qquad (14)
x_{wave} = (x_{\omega_1}, \ldots, x_{\omega_{N_c}}), \qquad (15)
A_{wave} = \mathrm{diag}\{A_{\omega_1}, \ldots, A_{\omega_{N_c}}\}, \qquad (16)
C_{wave} = \begin{bmatrix} C_{\omega_1} & \cdots & C_{\omega_{N_c}} \end{bmatrix}, \qquad (17)

where diag{·} is a symbol for a block diagonal matrix created from the elements in the brackets. The wave model (13)-(14) is integrated into each state of the USV state vector ν. Hence, we present the complex model of waves influencing the USV motion as

\dot{x}_{wave,\nu} = A_{wave,\nu}\, x_{wave,\nu}, \qquad (18)
y_{wave,\nu} = C_{wave,\nu}\, x_{wave,\nu}, \qquad (19)

where A_{wave,ν} and C_{wave,ν} are the block diagonal matrices

A_{wave,\nu} = \mathrm{diag}\{A_{wave}, A_{wave}, A_{wave}, A_{wave}, A_{wave}, A_{wave}\}, \qquad (20)

C_{wave,\nu} = \begin{bmatrix} C_{wave} & O_{1\times 2N_c} & \cdots & O_{1\times 2N_c} \\ O_{1\times 2N_c} & \ddots & & \vdots \\ \vdots & & C_{wave} & O_{1\times 2N_c} \\ O_{1\times 2N_c} & \cdots & O_{1\times 2N_c} & C_{wave} \end{bmatrix}, \qquad (21)

and

x_{wave,\nu} = (x_{wave,u}, x_{wave,v}, x_{wave,w}, x_{wave,p}, x_{wave,q}, x_{wave,r}). \qquad (22)
Finally, the novel 6 DOF mathematical model of the USV containing wave dynamics is

\dot{x}_{usv} = A_{usv}\, x_{usv}, \qquad (23)

where x_{usv} = (\eta_L, \nu, x_{wave,\nu}),

A_{usv} = \begin{bmatrix} O_{6\times 6} & I_{6\times 6} & O_{6\times 12N_c} \\ -M^{-1}G & -M^{-1}D & C_{wave,\nu} \\ O_{12N_c\times 6} & O_{12N_c\times 6} & A_{wave,\nu} \end{bmatrix}, \qquad (24)

and M = M_I + M_A.
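To make the structure of (6)-(24) concrete, the following is a minimal NumPy sketch (not the paper's implementation) that assembles A_usv from illustrative wave parameters and placeholder matrices M_I, M_A, D, and G; all numeric values are assumptions for demonstration only.

```python
import numpy as np
from scipy.linalg import block_diag

def wave_component(lmbda: float, omega0: float):
    """Single 2-DOF wave component, eqs. (6)-(7)."""
    A_w = np.array([[0.0, 1.0],
                    [-omega0**2, -2.0 * lmbda * omega0]])
    C_w = np.array([[0.0, 1.0]])
    return A_w, C_w

def build_usv_model(M_I, M_A, D, G, wave_params):
    """Assemble A_usv of eq. (24) from N_c wave components per velocity state."""
    comps = [wave_component(l, w0) for (l, w0) in wave_params]       # eqs. (8)-(11)
    A_wave = block_diag(*[A for A, _ in comps])                      # eq. (16)
    C_wave = np.hstack([C for _, C in comps])                        # eq. (17)
    A_wave_nu = block_diag(*[A_wave] * 6)                            # eq. (20)
    C_wave_nu = block_diag(*[C_wave] * 6)                            # eq. (21)
    M_inv = np.linalg.inv(M_I + M_A)
    n_w = A_wave_nu.shape[0]                                         # 12 * N_c
    A_usv = np.block([
        [np.zeros((6, 6)),   np.eye(6),           np.zeros((6, n_w))],
        [-M_inv @ G,         -M_inv @ D,          C_wave_nu],
        [np.zeros((n_w, 6)), np.zeros((n_w, 6)),  A_wave_nu],
    ])                                                               # eq. (24)
    return A_usv

# Illustrative placeholders, not identified parameters from the paper.
M_I, M_A = np.eye(6) * 50.0, np.eye(6) * 5.0
D, G = np.eye(6) * 10.0, np.eye(6) * 2.0
wave_params = [(0.1, 0.8), (0.05, 1.6)]   # (lambda, omega_0) for N_c = 2 components
A_usv = build_usv_model(M_I, M_A, D, G, wave_params)
print(A_usv.shape)   # (12 + 12*N_c, 12 + 12*N_c) = (36, 36)
```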
4 UAV-USV COLLABORATION IN WAVES
The pipeline integrating the entire UAV-USV system is depicted in Fig. 3. The pipeline comprises three main areas: the USV plant, the UAV plant, and the proposed approach for tight UAV-USV collaboration. The USV plant consists of a Global Positioning System (GPS) sensor and an Inertial Measurement Unit (IMU), whose data are sent to our state estimator. The UAV plant includes the onboard sensors: the AprilTag detector (Olson, 2011; Wang and Olson, 2016; Krogius et al., 2019) and the UltraViolet Direction And Ranging (UVDAR) system (Walter et al., 2020; Walter et al., 2018b; Walter et al., 2018a), which are discussed later in this paper. The UAV is controlled by the Multi-robot Systems (MRS) UAV system (Baca et al., 2021), enabling precise following of a trajectory planned by the proposed method. The State estimator fuses data from USV and UAV onboard sensors using the novel mathematical USV model (23) to obtain an accurate estimate of the USV states while moving on rough water surfaces. The estimated states and the novel mathematical model are then used in the State predictor to predict future USV states. The estimated and predicted USV states are forwarded to the Trajectory planner, which generates a UAV trajectory to follow the USV and land on its deck.
Figure 3: The figure depicts a pipeline diagram of the entire system used for experimental verification in this paper. The State
estimator fuses data from USV onboard sensors (GPS and IMU) and UAV onboard sensors (AprilTag detector and UVDAR
system). The estimated USV states are then sent to the State predictor, which predicts future USV states. The Trajectory
planner uses the estimated and predicted USV states to generate a UAV trajectory, which is precisely tracked by the MRS UAV
system (Baca et al., 2021).
4.1 State Estimator and Predictor
We rely on the Linear Kalman Filter (LKF) (Kalman, 1960) as the state estimator, utilizing a discrete version A_{usv,d} of our novel linear USV model (23). The LKF consists of two main steps: the prediction step and the correction step. The prediction step uses the last estimated USV state and propagates it through the mathematical model to obtain a new state

x_{usv}(k+1) = A_{usv,d}\, x_{usv}(k), \qquad (25)
P_{usv}(k+1) = A_{usv,d}\, P_{usv}(k)\, A_{usv,d}^{\top} + Q_{usv}, \qquad (26)
where k ∈ Z^+ is a time step, P_{usv}(k) ∈ R^{12(1+N_c)×12(1+N_c)} stands for the covariance matrix of the USV state x_{usv}(k), and Q_{usv} ∈ R^{12(1+N_c)×12(1+N_c)} represents a system noise matrix. The correction step incorporates an incoming measurement z(k) to update the last estimated state

x_{usv}(k) = x_{usv}(k) + G(k)\,\big(z(k) - C\,x_{usv}(k)\big), \qquad (27)
P_{usv}(k) = P_{usv}(k) - G(k)\,C\,P_{usv}(k), \qquad (28)
G(k) = P_{usv}(k)\,C^{\top}\big(C\,P_{usv}(k)\,C^{\top} + R\big)^{-1}, \qquad (29)
where R is the measurement noise matrix, and the matrix C represents the mapping between the state x_{usv} and the measurement z. To predict future USV states from the last estimated values x_{usv}(k) and P_{usv}(k), the prediction step of the LKF is iteratively applied to obtain N_p ∈ Z^+ predictions

\hat{x}_{usv}(k_p + 1) = A_{usv,d}\, \hat{x}_{usv}(k_p), \qquad (30)
\hat{P}_{usv}(k_p + 1) = A_{usv,d}\, \hat{P}_{usv}(k_p)\, A_{usv,d}^{\top} + Q_{usv}, \qquad (31)

where k_p = 0, 1, \ldots, N_p - 1, \hat{x}_{usv}(0) = x_{usv}(k), and \hat{P}_{usv}(0) = P_{usv}(k).
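The following is a minimal sketch of the predict/correct cycle (25)-(29) and the N_p-step predictor (30)-(31), written under our own assumptions rather than taken from the paper's code; the discretization step dt, the noise matrices Q and R, the placeholder dynamics matrix, and the GPS measurement matrix are illustrative only.

```python
import numpy as np
from scipy.linalg import expm

def lkf_predict(x, P, A_d, Q):
    """Prediction step, eqs. (25)-(26)."""
    return A_d @ x, A_d @ P @ A_d.T + Q

def lkf_correct(x, P, z, C, R):
    """Correction step, eqs. (27)-(29)."""
    G = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)   # Kalman gain, eq. (29)
    x_new = x + G @ (z - C @ x)                    # eq. (27)
    P_new = P - G @ C @ P                          # eq. (28)
    return x_new, P_new

def predict_horizon(x, P, A_d, Q, N_p):
    """Iterate the prediction step N_p times, eqs. (30)-(31)."""
    predictions = []
    for _ in range(N_p):
        x, P = lkf_predict(x, P, A_d, Q)
        predictions.append(x.copy())
    return predictions

# Placeholder setup: n = 12(1 + N_c) states with N_c = 2; A_usv here is a simple
# stable stand-in for the continuous-time model of eq. (23).
n = 36
A_usv = -0.01 * np.eye(n)
dt = 0.01                               # assumed 100 Hz estimator rate
A_d = expm(A_usv * dt)                  # zero-order-hold discretization (one option)
x, P = np.zeros(n), np.eye(n)
Q, R = np.eye(n) * 1e-3, np.eye(3) * 0.5
C_gps = np.zeros((3, n)); C_gps[:3, :3] = np.eye(3)   # GPS observes (x, y, z) only
x, P = lkf_predict(x, P, A_d, Q)
x, P = lkf_correct(x, P, np.array([1.0, 2.0, 0.0]), C_gps, R)
future = predict_horizon(x, P, A_d, Q, N_p=200)       # 2 s of predictions at 100 Hz
```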
4.2 Onboard USV Sensors
In practical scenarios, the UAV often operates at distances where its onboard sensors cannot provide sufficient data for USV state estimation. To address this issue, we assume a communication link between the UAV and the USV, operating at a rate of at least 10 Hz. The USV sends sensor data to the UAV to roughly estimate the USV state. Subsequently, the UAV can fly into the proximity of the USV to use the onboard UAV sensors, thereby increasing estimation and prediction precision. The first onboard USV sensor is a GPS device providing global position information. The second onboard USV sensor is an IMU that measures heading, angular velocity, and linear acceleration. These USV sensors are integrated within our MRS boat unit, which is placed on the USV board, as depicted in Fig. 4.
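As an illustration of how heterogeneous measurements enter the correction step (27)-(29), the sketch below (our own simplification, not the paper's code) builds a separate measurement matrix C for each sensor so that a GPS position fix, an IMU heading/rate reading, or a relative-pose detection updates only the states it observes; the state ordering and index choices are assumptions.

```python
import numpy as np

N_C = 2
N_STATES = 12 * (1 + N_C)   # eta_L (6) + nu (6) + wave states (12 * N_c)

def selector(rows):
    """Measurement matrix that picks the listed state indices; a simplification
    in which each sensor is assumed to observe a subset of states directly."""
    C = np.zeros((len(rows), N_STATES))
    for i, r in enumerate(rows):
        C[i, r] = 1.0
    return C

# Assumed state ordering: (x, y, z, phi, theta, psi, u, v, w, p, q, r, wave states).
C_GPS = selector([0, 1, 2])            # global position fix
C_IMU = selector([5, 9, 10, 11])       # heading and angular velocities
C_TAG = selector([0, 1, 2, 3, 4, 5])   # AprilTag / UVDAR pose, after transforming
                                       # the relative detection into the world frame

# Each sensor then calls the correction step (27)-(29) with its own C and R
# at its own rate, while the prediction step runs at the estimator rate.
```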
4.3 Onboard UAV Sensors
Establishing a fast and reliable communication link
in the real world is challenging (Tran and Ahn, 2019).
To deal with drop-outs of communication links, the
UAV carries two onboard sensors that do not rely on
a communication link. These onboard UAV sensors
increase redundancy and enable usage of the system
in various real-world conditions. Moreover, the two
onboard sensory modalities demonstrate the system’s
ability to fuse data from multiple UAV and USV sen-
sors that can be chosen according to the desired appli-
cation.
Figure 4: The USV board showing an AprilTag, UV LED
(marked with red circles), and MRS boat unit, which con-
tains GPS and IMU.
The first sensor is the AprilTag detector (Olson,
2011; Wang and Olson, 2016; Krogius et al., 2019).
This vision-based system detects landmarks known as
AprilTag in the camera image frame. The used land-
mark placed on USV board is shown in Fig. 4. In
this custom landmark layout, a smaller AprilTag is
placed within the empty space of the larger AprilTag,
enabling detection from various distances (Krogius
et al., 2019). The measured data from the April-
Tag detector includes the position and orientation of
the detected landmark. However, the AprilTag detec-
tor relies on sufficient lighting conditions to provide
quality measurements due to its passive landmarks,
which limits the system’s usage, for example, in dark
environments.
The second onboard UAV sensor is the UVDAR
system (Walter et al., 2020; Walter et al., 2018b; Wal-
ter et al., 2018a), which detects blinking UltraVio-
let (UV) Light-Emitting Diodes (LEDs) placed on the
target in camera image frames. The active blinking of
the LEDs allows our solution to be used even in poor
light conditions, such as darkness, where the April-
Tag detector fails to provide sufficient measurements.
The UV LEDs placed on the USV board are shown in
Fig. 4. The UVDAR system provides measurements
of the position and orientation of the target.
4.4 Trajectory Planner
The trajectory planner used for experimental verifi-
cation is based on linear MPC, with detailed descrip-
tions provided in (Prochazka, 2023). This planner uti-
lizes estimated and predicted USV states (Sec. 4.1)
to align the UAV trajectory with the motion of the
USV. When the task is to follow the USV, the UAV
maintains a desired distance above the USV board
and promptly responds to changes in USV movement,
such as those induced by waves. This responsiveness
is important in applications where the UAV is tethered
to the USV using a power supply cable to recharge
UAV batteries (Talke et al., 2018). Failure of the UAV
to react to the motion of the USV on waves poses a
risk of the cable pulling the UAV and destabilizing it.
The landing task includes even more challenges.
Firstly, the UAV must track the USV steadily. Sub-
sequently, the UAV begins descending towards the
USV board. In the final phase, the UAV lands on
the USV board at a predefined vertical velocity rel-
ative to the USV velocity. This controlled descent is
crucial for a safe landing, as it ensures that the verti-
cal and touchdown velocity of the UAV are regulated
throughout the entire maneuver, regardless of the ver-
tical motion of the USV in the waves. Maintaining
a static descending velocity despite the vertical mo-
tion of the USV carries a significant risk, as waves
may push the USV towards the UAV, which signifi-
cantly increases the UAV touchdown velocity, poten-
tially causing damage to the UAV.
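To illustrate the descent logic described above, here is a small sketch (our own illustration, not the planner from (Prochazka, 2023)) that keeps the commanded touchdown velocity fixed relative to the predicted USV heave velocity; the descent-rate value is an assumed placeholder.

```python
def descent_velocity_command(usv_heave_velocity: float,
                             descent_rate: float = 0.3) -> float:
    """Commanded UAV vertical velocity in the world frame (m/s, positive up).

    The UAV descends at `descent_rate` m/s relative to the deck, so the closing
    (touchdown) speed stays bounded even when waves move the deck vertically.
    """
    return usv_heave_velocity - descent_rate

# Example: on a wave crest the deck rises at 0.4 m/s, so the UAV must still climb
# at roughly 0.1 m/s to keep the 0.3 m/s closing speed.
print(descent_velocity_command(0.4))
```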
5 VERIFICATION
The proposed approach was first verified in simula-
tions and subsequently deployed in real-world exper-
iments. Moreover, we compare our results with those
obtained using the state-of-the-art method (Polvara
et al., 2018). The simulations were performed in a re-
alistic robotic simulator Gazebo, extended by the Vir-
tual RobotX (VRX) simulator (Bingham et al., 2019),
which provides a realistic simulation of the harsh
marine environment with large waves, as the UAV-
USV collaboration in such conditions is our main
motivation. A video attachment supporting the re-
sults of this paper is available at https://mrs.fel.cvut.
cz/papers/towards-uav-usv-collaboration. The nov-
elty of our approach lies in USV state estimation
and prediction using our proposed novel USV model
containing wave dynamics (Sec. 3). Therefore, we
evaluate our estimation and prediction using Root
Mean Square Error (RMSE). Table 1 presents the RMSE of the estimated USV states using our approach, computed from the performed simulations in which the UAV follows and lands on the USV in a harsh environment. The RMSE values are computed with
respect to the sensors used for the estimation.
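For completeness, the RMSE metric used below can be computed as in this short sketch (a generic formulation, not the paper's evaluation script); the example arrays are placeholders.

```python
import numpy as np

def rmse(estimated: np.ndarray, ground_truth: np.ndarray) -> float:
    """Root Mean Square Error over a time series of an estimated quantity."""
    err = np.asarray(estimated) - np.asarray(ground_truth)
    return float(np.sqrt(np.mean(err ** 2)))

# Placeholder example: RMSE of an estimated heave trajectory against ground truth.
z_est = np.array([0.10, 0.22, 0.31, 0.18])
z_gt = np.array([0.12, 0.20, 0.35, 0.15])
print(rmse(z_est, z_gt))
```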
Table 1: RMSE of the estimated USV states using the approach proposed in this paper with respect to the individual sensors.

sensor     RMSE (x, y, z) [m]   RMSE (φ, θ, ψ) [rad]   RMSE (u, v, w) [m/s]   RMSE (p, q, r) [rad/s]
GPS        0.989                -                      0.985                  -
IMU        -                    0.011                  -                      0.536
UVDAR      0.425                0.124                  0.977                  1.034
AprilTag   0.088                0.052                  0.848                  0.606

Table 2: RMSE of the estimated USV states using the approach proposed in this paper compared to a state-of-the-art method.

method         RMSE (x, y, z) [m]   RMSE (φ, θ, ψ) [rad]   RMSE (u, v, w) [m/s]   RMSE (p, q, r) [rad/s]
SOTA           0.313                0.056                  0.580                  0.329
our approach   0.116                0.017                  0.303                  0.235

Figure 5: Estimated USV position p = (x, y, z) and orientation Θ = (φ, θ, ψ) using the method proposed in this paper, compared with the SOTA estimate and the ground truth (GT).

The estimation of position (x, y, z) and corresponding linear velocities (u, v, w) using GPS yields results with the largest RMSE. A better estimation for the states (x, y, z) and (u, v, w) compared to GPS is
achieved through the UVDAR system. The RMSE
for states (x, y, z) is half the size when using the
UVDAR system compared to the GPS, and for states
(u, v, w), the RMSE using the UVDAR system is
slightly smaller than when using the GPS sensor.
However, for the states (φ, θ, ψ) and corresponding
angular velocities (p, q, r), the UVDAR yields the
largest RMSE among all sensors. The most accurate
estimation of the states (φ, θ, ψ) is attained using IMU
data, as indicated by its minimal RMSE. The smallest
RMSE for (x, y, z) and (u, v, w) is achieved using the
AprilTag detector. Moreover, the estimation of states
(φ, θ, ψ) and (p, q, r) using the AprilTag detector re-
sults in an RMSE that is half as large as that of the
UVDAR system. However, the RMSE of the April-
Tag detector for states (φ, θ, ψ) remains four and a
half times larger than that of the IMU for the same
states.
Fig. 5 shows the estimated USV states from one of the many performed simulations, in which our approach fused data from all sensors. The graphs illustrate that all estimated states correspond to the Ground Truth (GT) values. The RMSE of the estimated states using our approach is provided in Table 2.
We compared our approach with the most relevant state-of-the-art method (Polvara et al., 2018), which we call SOTA in this paper, as depicted in Fig. 5. Our approach has a smaller RMSE for all USV states (Table 2).

Table 3: RMSE of predicted and estimated USV states using the method proposed in this paper.

USV states (our approach)   RMSE (x, y, z) [m]   RMSE (φ, θ, ψ) [rad]
predicted states            0.737                0.196
estimated states            0.116                0.017

The main difference is observed in the esti-
mation of orientation (φ, θ, ψ), where our approach
achieves more than three times smaller RMSE com-
pared to the SOTA. This substantial improvement in
orientation estimation can be attributed to the incorporation of waves in the mathematical model and the fusion of data from multiple sensors, features that are lacking in SOTA.
The impact of this difference is evident in Fig. 5 in
the estimation of heave z, roll φ, and pitch θ, where
our approach exhibits closer alignment with GT val-
ues compared to SOTA.
The predicted USV states (x, y, z, φ, θ, ψ) from one
of the performed simulations are shown in Fig. 6. The
two-second predictions are computed every two sec-
onds. The figure illustrates that the predictions ini-
tially deviate more from the GT values at the begin-
ning of the simulation. However, as the estimation
progresses over time, the predicted USV states be-
come increasingly accurate. The RMSE of the predictions for the states (x, y, z) presented in Table 3 corresponds to the RMSE of the estimation of these states using only the GPS sensor (Table 1). The RMSE of the predictions for the states (φ, θ, ψ) corresponds to the RMSE of these states estimated using the UVDAR system (Table 1). These results demonstrate the applicability of the computed predictions to the UAV trajectory planner.
5.1 Real-World Experiments
To analyze the real-world performance and to show
robustness to real uncertainties and disturbances, we
deployed the presented approach in real-world exper-
iments. During the first set of real-world experiments,
the overall system demonstrated robust performance,
allowing the UAV to repeatedly follow the USV and
land on the USV’s deck. In these experiments, the
USV was manually forced to perform wave motions,
as depicted in Fig. 7 (a). The UAV was equipped with
an onboard computer, GPS sensor, and cameras for
the AprilTag detector and the UVDAR system (see
(Hert et al., 2023; Hert et al., 2022) for details). The
USV’s deck, featuring the AprilTag and UV LEDs,
is detailed in Fig. 4. Additionally, the USV carried
the MRS boat unit containing IMU and GPS sensors.
The detailed description of the used UAV and USV
Figure 6: Predicted USV position p = (x, y, z) and orientation Θ = (φ, θ, ψ) using the method proposed in this paper, compared with the estimated states and the ground truth (GT).
sensors is provided in Sec. 4.2 and 4.3.
The estimated USV states from one of the per-
formed real-world experiments are depicted in Fig. 8.
Initially, the UAV approached the USV while USV
states were estimated using received GPS and IMU
data. Subsequently, the UAV onboard sensors im-
proved the estimation, as evident in the graphs of
USV position (x, y, z) (Fig. 8). Then, the UAV fol-
lowed the USV, which performed artificially induced
wave motion that can be noticed especially in roll and
pitch graphs. The pitch was affected by waves from
37 s to 130 s and from 200 s to 264 s, while the roll
was affected from 130 s to 200 s.
The UAV was also able to successfully land on the
USV, as depicted in Fig. 7 (b). The USV was towed
by another boat, while the UAV followed it using a
trajectory planner (Sec. 4.4) that utilized estimated
and predicted USV states. When the conditions for
Figure 7: The figure depicts snapshots from the performed
real-world experiments, in which the UAV (marked with a
red circle) followed the USV (a) and landed on the USV (b).
Figure 8: Estimated USV states using our approach in one
of the performed real-world experiments.
a safe landing were met, the UAV initiated a landing
maneuver, which was successfully completed within
6 s.
6 CONCLUSION
In this paper, we introduce a novel onboard UAV ap-
proach designed to facilitate tight collaboration be-
tween a UAV and a USV in harsh marine environments during tasks such as following and landing maneuvers. The main
contribution of the proposed solution is the USV state
estimator and predictor, operating in 6 DOFs, which
uses our novel mathematical USV model incorporat-
ing wave dynamics. The state estimator fuses data
from multiple UAV and USV sensors, ensuring accu-
rate estimation across various real-world conditions.
Subsequently, the estimated USV states are fed into
the state predictor, which utilizes the mathematical
USV model to predict future USV states in 6 DOFs.
We verified the overall system through extensive sim-
ulations and compared the results of the proposed
approach with the state-of-the-art method. The pro-
posed approach was also deployed in real-world ex-
periments, where the UAV was able to repeatedly fol-
low the USV and land on the USV’s deck. Future
work will focus on developing a nonlinear mathemat-
ical model of the USV to better capture its dynamics
on a wavy water surface. Moreover, the proposed ap-
proach is planned to be integrated into an autonomous
UAV-USV team designed for garbage removal and
water quality monitoring.
ACKNOWLEDGEMENTS
This work was funded by the Czech Science Foundation (GAČR) under research project no. 23-07517S, by the European Union under the project Robotics and advanced industrial production (reg. no. CZ.02.01.01/00/22 008/0004590), and by CTU grant no. SGS23/177/OHK3/3T/13.
REFERENCES
Abujoub, S., McPhee, J., Westin, C., and Irani, R. A.
(2018). Unmanned aerial vehicle landing on maritime
vessels using signal prediction of the ship motion. In
OCEANS 2018 MTS/IEEE Charleston, pages 1–9.
Aissi, M., Moumen, Y., Berrich, J., Bouchentouf, T.,
Bourhaleb, M., and Rahmoun, M. (2020). Au-
tonomous solar usv with an automated launch and re-
covery system for uav: State of the art and design.
In 2020 IEEE 2nd International Conference on Elec-
tronics, Control, Optimization and Computer Science
(ICECOCS), pages 1–6.
Aniceto, A. S., Biuw, M., Lindstrøm, U., Solbø, S. A.,
Broms, F., and Carroll, J. (2018). Monitoring marine
mammals using unmanned aerial vehicles: quantify-
ing detection certainty. Ecosphere, 9(3):e02122.
Baca, T., Petrlik, M., Vrba, M., Spurny, V., Penicka, R.,
Hert, D., and Saska, M. (2021). The MRS UAV
System: Pushing the Frontiers of Reproducible Re-
search, Real-world Deployment, and Education with
Autonomous Unmanned Aerial Vehicles. Journal of
Intelligent & Robotic Systems, 102(26):1–28.
Bingham, B., Aguero, C., McCarrin, M., Klamo, J., Malia,
J., Allen, K., Lum, T., Rawson, M., and Waqar,
R. (2019). Toward maritime robotic simulation in
Gazebo. In Proceedings of MTS/IEEE OCEANS Con-
ference, Seattle, WA.
Fossen, T. I. (2011). Handbook of Marine Craft Hydro-
dynamics and Motion Control. John Wiley & Sons,
United Kingdom, first edition edition.
Gupta, P. M., Pairet, E., Nascimento, T., and Saska, M.
(2023). Landing a uav in harsh winds and turbulent
open waters. IEEE Robotics and Automation Letters,
8(2):744–751.
Han, Y. and Ma, W. (2021). Automatic monitoring of wa-
ter pollution based on the combination of uav and
usv. In 2021 IEEE 4th International Conference on
Electronic Information and Communication Technol-
ogy (ICEICT), pages 420–424.
Hert, D., Baca, T., Petracek, P., Kratky, V., Penicka, R.,
Spurny, V., Petrlik, M., Vrba, M., Zaitlik, D., Stoudek,
P., Walter, V., Stepan, P., Horyna, J., Pritzl, V.,
Sramek, M., Ahmad, A., Silano, G., Bonilla Licea, D.,
Stibinger, P., Nascimento, T., and Saska, M. (2023).
MRS Drone: A Modular Platform for Real-World De-
ployment of Aerial Multi-Robot Systems. Journal of
Intelligent & Robotic Systems.
Hert, D., Baca, T., Petracek, P., Kratky, V., Spurny, V.,
Petrlik, M., Vrba, M., Zaitlik, D., Stoudek, P., Wal-
ter, V., Stepan, P., Horyna, J., Pritzl, V., Silano, G.,
Bonilla Licea, D., Stibinger, P., Penicka, R., Nasci-
mento, T., and Saska, M. (2022). MRS Modular UAV
Hardware Platforms for Supporting Research in Real-
World Outdoor and Indoor Environments. In 2022
International Conference on Unmanned Aircraft Sys-
tems (ICUAS), pages 1264–1273. IEEE.
Kalman, R. E. (1960). A new approach to linear filtering
and prediction problems. Journal of Basic Engineer-
ing, 82(1):35–45.
Keller, A. and Ben-Moshe, B. (2022). A robust and accu-
rate landing methodology for drones on moving tar-
gets. Drones, 6(4).
Krogius, M., Haggenmiller, A., and Olson, E. (2019). Flex-
ible layouts for fiducial tags. In IEEE/RSJ Interna-
tional Conference on Intelligent Robots and Systems
(IROS).
Lee, S., Lee, J., Lee, S., Choi, H., Kim, Y., Kim, S., and
Suk, J. (2019). Sliding mode guidance and control for
uav carrier landing. IEEE Transactions on Aerospace
and Electronic Systems, 55(2):951–966.
Marconi, L., Isidori, A., and Serrani, A. (2002). Au-
tonomous vertical landing on an oscillating plat-
form: an internal-model based approach. Automatica,
38(1):21–32.
Meng, Y., Wang, W., Han, H., and Ban, J. (2019). A vi-
sual/inertial integrated landing guidance method for
uav landing on the ship. Aerospace Science and Tech-
nology, 85:474–480.
Murphy, R., Stover, S., Pratt, K., and Griffin, C. (2006). Co-
operative damage inspection with unmanned surface
vehicle and micro unmanned aerial vehicle at Hurri-
cane Wilma. In 2006 IEEE/RSJ International Confer-
ence on Intelligent Robots and Systems, pages 9–9.
Murphy, R. R., Steimle, E., Griffin, C., Cullins, C., Hall,
M., and Pratt, K. (2008). Cooperative use of un-
manned sea surface and micro aerial vehicles at Hur-
ricane Wilma. Journal of Field Robotics, 25(3):164–
180.
Olson, E. (2011). AprilTag: A robust and flexible visual
fiducial system. In IEEE International Conference on
Robotics and Automation (ICRA), pages 3400–3407.
IEEE.
Polvara, R., Sharma, S., Wan, J., Manning, A., and Sut-
ton, R. (2018). Vision-based autonomous landing of a
quadrotor on the perturbed deck of an unmanned sur-
face vehicle. Drones, 2(2).
Prochazka, O. (2023). Trajectory planning for autonomous
landing of a multirotor helicopter on a boat. Master’s
thesis, Faculty of Electrical Engineering, Czech Tech-
nical University in Prague.
Román, A., Tovar-Sánchez, A., Gauci, A., Deidun, A., Caballero, I., Colica, E., D'Amico, S., and Navarro, G. (2023). Water-quality monitoring with a uav-mounted multispectral camera in coastal waters. Remote Sensing, 15(1).
Talke, K. A., De Oliveira, M., and Bewley, T. (2018). Cate-
nary tether shape analysis for a uav - usv team. In
2018 IEEE/RSJ International Conference on Intelli-
gent Robots and Systems (IROS), pages 7803–7809.
Tran, Q. V. and Ahn, H.-S. (2019). Multi-agent localization
of a common reference coordinate frame: An extrin-
sic approach. IFAC-PapersOnLine, 52(20):67–72. 8th
IFAC Workshop on Distributed Estimation and Con-
trol in Networked Systems NECSYS 2019.
Venugopalan, T. K., Taher, T., and Barbastathis, G. (2012).
Autonomous landing of an unmanned aerial vehicle
on an autonomous marine vehicle. In 2012 Oceans,
pages 1–9.
Walter, V., N.Staub, Saska, M., and Franchi, A. (2018a).
Mutual localization of UAVs based on blinking ultra-
violet markers and 3D time-position Hough transform.
In 14th IEEE International Conference on Automation
Science and Engineering (CASE 2018).
Walter, V., Saska, M., and Franchi, A. (2018b). Fast mutual
relative localization of UAVs using ultraviolet LED
markers. In 2018 International Conference on Un-
manned Aircraft System (ICUAS 2018).
Walter, V., Vrba, M., and Saska, M. (2020). On training
datasets for machine learning-based visual relative lo-
calization of micro-scale UAVs. In 2020 IEEE In-
ternational Conference on Robotics and Automation
(ICRA), pages 10674–10680.
Wang, J. and Olson, E. (2016). AprilTag 2: Efficient and
robust fiducial detection. In IEEE/RSJ International
Conference on Intelligent Robots and Systems (IROS).
Xu, Z.-C., Hu, B.-B., Liu, B., Wang, X., and Zhang, H.-
T. (2020). Vision-based autonomous landing of un-
manned aerial vehicle on a motional unmanned sur-
face vessel. In 2020 39th Chinese Control Conference
(CCC), pages 6845–6850.
Yang, L., Liu, Z., Wang, X., Wang, G., Hu, X., and
Xi, Y. (2021). Autonomous landing of a rotor un-
manned aerial vehicle on a boat using image-based vi-
sual servoing. In 2021 IEEE International Conference
on Robotics and Biomimetics (ROBIO), pages 1848–
1854.
Zhang, H.-T., Hu, B.-B., Xu, Z., Cai, Z., Liu, B., Wang,
X., Geng, T., Zhong, S., and Zhao, J. (2021). Visual
navigation and landing control of an unmanned aerial
vehicle on a moving autonomous surface vehicle via
adaptive learning. IEEE Transactions on Neural Net-
works and Learning Systems, 32(12):5345–5355.