Figure 2: An example of an orchard. The blue point is the CP of the corridor and the purple point is the VP of the corridor.
platforms can be mounted on even smaller UAVs.
The remainder of this paper is structured as follows: in Section 2 we relate our method to the current literature. In Section 3 we explain how we detect the vanishing point and the center point of the corridor to correct the yaw and roll of the UAV during flight. In Section 4 our results are discussed, and in Section 5 conclusions are drawn and future work is outlined.
2 RELATED WORK
As discussed in (Pajares, 2015), UAVs are used in many applications, carrying different sensors to extract information from their environment.
UAVs are increasingly used to fly over open fields to construct a (3D) map for precision agriculture, as in (Zarco-Tejada et al., 2014), where they monitor plant growth. Diseases can also be detected at an early stage by a UAV flying over the terrain, as in (Garcia-Ruiz et al., 2013), where a hyperspectral camera is mounted on the UAV to find abnormalities in citrus trees. In (Colomina and Molina, 2014) an overview is given of different photogrammetry and remote sensing techniques.
In (Puttemans et al., 2016) software was developed to detect and count fruit in images taken by a camera mounted on a wheeled robot for early harvest estimation. When a more accurate view of the fruit is needed, wheeled robots that can drive through the orchard are generally used, as in (Christiansen et al., 2011; Barawid et al., 2007; Andersen et al., 2010; Hiremath et al., 2014), where a LIDAR is combined with other sensors such as GPS to drive through the orchard, or in (Rovira-Más et al., 2008), where stereo vision is used to build a 3D map of the orchard with a wheeled robot. The disadvantage of these robots is that they all carry a heavy and expensive laser scanner. In (Xue et al., 2012) vision-based techniques are used to find the path between the trees: a simple color segmentation distinguishes the path from the corn plants. Of course, a very expensive wheeled robot that requires frequent maintenance is still needed.
Navigating through an orchard with a UAV instead of a wheeled vehicle has multiple advantages. The slope and condition of the path matter far less than for a wheeled robot. Furthermore, a UAV flies much faster, and the cost of a UAV and its maintenance is much lower than that of a wheeled robot.
Initial experiments with UAVs flying through an orchard were already performed by (Verbeke et al., 2014), who designed a UAV frame specifically for flying in fruit orchards, which can be equipped with a small computer and cameras. In (Stefas et al., 2016), experiments with a monocular and a binocular camera were performed to retrieve the path between the tree rows. Unfortunately, their algorithm is based on a traffic lane detection algorithm and results in a poor classification of the tree rows.
We developed a new approach to navigate through an orchard using a cheap webcam and on-board processing. Our approach has a high accuracy in finding both the center and the end of the corridor. No expensive laser scanner or robot is needed, and our system can be used in multiple types of orchards.
3 APPROACH
When a human walks through an orchard, he follows the path to avoid collisions with the trees. Two actions take place: 1. the human tries to stay in the middle of the path; 2. the human looks to the end of the corridor to walk in a straight line. The same holds for a UAV: the roll should be controlled to stay in the center of the path, and the yaw to keep the nose of the UAV pointing towards the end of the corridor. Evidently, the pitch is steered at a fixed speed to move forward, and the altitude is kept stable.
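This mapping from the two image points to the two controlled DOF can be sketched as a simple proportional controller. The function name and gain values below are illustrative assumptions, not the implementation used in this work:

```python
def corridor_control(cp_x, vp_x, image_width, k_roll=0.002, k_yaw=0.002):
    """Map the horizontal offsets of the center point (CP) and the
    vanishing point (VP) to roll and yaw commands.

    A point to the right of the image center yields a positive
    command, steering the UAV back toward the corridor axis.
    The gains k_roll and k_yaw are placeholder values.
    """
    center = image_width / 2.0
    roll_cmd = k_roll * (cp_x - center)  # stay in the middle of the path
    yaw_cmd = k_yaw * (vp_x - center)    # point the nose at the corridor end
    return roll_cmd, yaw_cmd
```

For example, with a 640-pixel-wide image, a CP detected at x = 400 while the VP sits exactly at the center (x = 320) produces a roll correction to the right and no yaw correction.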
To control these two DOF, the roll and the yaw, we developed an algorithm that estimates the center of the corridor (CP, center point) and the end of the corridor (VP, vanishing point). The algorithm is designed to estimate these two points in a computationally low-cost manner so that it can run in real time on embedded hardware mounted on the UAV. Figure 3 shows the overall system in which the CP and the VP are found. In Section 3.1 we first show how we estimate the CP, and in Section 3.2 we explain how the VP is found.
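As background for the VP estimation, a vanishing point is commonly computed as the least-squares intersection of detected line segments (e.g. tree-row edges). The sketch below is a generic illustration of that idea, not the specific estimator developed in this work; each segment contributes one linear constraint via its unit normal:

```python
import math

def vanishing_point(segments):
    """Least-squares intersection point of a set of line segments.

    Each segment ((x1, y1), (x2, y2)) contributes one equation
    a*x + b*y = c, where (a, b) is the segment's unit normal.
    The 2x2 normal equations are accumulated and solved directly.
    """
    Saa = Sab = Sbb = Sac = Sbc = 0.0
    for (x1, y1), (x2, y2) in segments:
        a, b = y2 - y1, x1 - x2        # normal to the segment direction
        n = math.hypot(a, b)
        a, b = a / n, b / n
        c = a * x1 + b * y1            # line offset through (x1, y1)
        Saa += a * a; Sab += a * b; Sbb += b * b
        Sac += a * c; Sbc += b * c
    det = Saa * Sbb - Sab * Sab        # near-zero det: lines are parallel
    x = (Sbb * Sac - Sab * Sbc) / det
    y = (Saa * Sbc - Sab * Sac) / det
    return x, y
```

With more than two segments the result is the point minimizing the sum of squared perpendicular distances to all lines, which makes the estimate robust to small errors in individual segments.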