captured by the UAV's downward-facing camera. Experimen-
tal results using real video captured by a quadrotor
UAV illustrate that the proposed algorithm is capable
of determining the main position and orientation of
borders such as water channel margins and different
types of ground patterns.
It is important to discuss the limitations of the line
detection algorithm. First, it is assumed that the images
captured by the downward-facing camera contain a
pattern with a main orientation. Second, the proposed
algorithm relies on contrast variation to detect lines.
This is not a strong limitation, since contrast variation
is common in many situations, such as river and channel
margins, streets, etc. However, surfaces with weak
contrast variation are likely to cause the algorithm to
perform poorly. Finally, the presence of shadows also
affects the performance of the algorithm. For instance,
during our experiments we noticed that shadows of tree
trunks had a significant influence on the detected lines
whenever they were visible in the image frame. Because
the UAV is moving, we believe that the influence of
shadows can be minimized by using a more robust
filtering method when calculating the average position
and orientation of the main lines.
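As a minimal illustrative sketch of such a robust filter (not the implementation used in this work), the Python code below rejects outlier line segments, such as those produced by shadow edges, with a median-absolute-deviation test before averaging their positions and orientations. The segment representation, variable names, and threshold are assumptions made for illustration only.

```python
import numpy as np

def robust_line_estimate(lines, mad_threshold=2.5):
    """Estimate the main line position and orientation from detected
    segments, rejecting outliers (e.g., shadow edges) via the median
    absolute deviation (MAD) before averaging.

    `lines` is an array of (x, theta) pairs: horizontal position of a
    segment midpoint in the image and its orientation in radians.
    """
    lines = np.asarray(lines, dtype=float)
    x, theta = lines[:, 0], lines[:, 1]

    # Double the angle so that orientations near -pi/2 and +pi/2,
    # which describe the same undirected line, map to the same vector.
    vx, vy = np.cos(2 * theta), np.sin(2 * theta)

    def mad_mask(values):
        # Flag values whose deviation from the median is large
        # relative to the median absolute deviation.
        med = np.median(values)
        mad = np.median(np.abs(values - med)) + 1e-9
        return np.abs(values - med) / mad < mad_threshold

    # Keep only segments that are inliers in both position and orientation.
    keep = mad_mask(x) & mad_mask(vx) & mad_mask(vy)

    x_hat = np.mean(x[keep])
    theta_hat = 0.5 * np.arctan2(np.mean(vy[keep]), np.mean(vx[keep]))
    return x_hat, theta_hat
```

Compared with a plain mean, this kind of filter limits the pull that a few spurious segments can exert on the estimated line, at the cost of an extra pass over the detections per frame.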
This work is part of a research project that aims to
investigate the use of UAVs in agricultural and rural
areas. As an immediate next step, we intend to integrate
the line-following controller with the line detection
algorithm so that the quadrotor can fly autonomously.
We also plan to integrate the proposed system into
the multi-robot control scheme we have presented
in (Rampinelli et al., 2010; Brandão et al., 2010;
Brandão et al., 2015) to have the quadrotors flying in
formation while also cooperating with a ground vehicle.
ACKNOWLEDGEMENTS
The authors thank Hanze University of Applied Sci-
ences and the University of Groningen for the support
given to this research. Dr. Brandão thanks CNPq and
Funarbe for supporting his participation in this project.
Dr. Martins also thanks Dr. Marco Wiering for his
valuable advice and guidance, CAPES (a foundation
of the Brazilian Ministry of Education) for the financial
support (Project BRANETEC 011/13 and process
BEX 1603/14-0), and the Federal Institute of Espírito
Santo for the authorization to be on leave to work on
his research in the Netherlands.
REFERENCES
Brandão, A., Sarapura, J., Caldeira, E., Sarcinelli-Filho, M.,
and Carelli, R. (2010). Decentralized control of a for-
mation involving a miniature helicopter and a team of
ground robots based on artificial vision. In IEEE Latin
American Robotics Symposium (LARS).
Brandão, A. S., Rampinelli, V. T. L., Martins, F. N.,
Sarcinelli-Filho, M., and Carelli, R. (2015). The mul-
tilayer control scheme: A strategy to guide n-robots
formations with obstacle avoidance. Journal of Control,
Automation and Electrical Systems, 26(3):201–214.
Brandão, A. S., Sarcinelli-Filho, M., and Carelli, R. (2013).
High-level underactuated nonlinear control for rotor-
craft machines. In IEEE International Conference on
Mechatronics, Vicenza, Italy.
Martins, F. N., Brandão, A. S., and Soneguetti, H. B.
(2015). A Vision-based Line Following Strat-
egy for an Autonomous UAV. Available at
http://youtu.be/gd9LFQkHG28.
Parrot (2014). AR.Drone 2.0 Technical Specifications.
Available at http://ardrone2.parrot.com/ardrone-
2/specifications/.
Raffo, G. V., Ortega, M. G., and Rubio, F. R. (2010). An
integral predictive/nonlinear H∞ control structure for
a quadrotor helicopter. Automatica, 46:29–39.
Raffo, G. V., Ortega, M. G., and Rubio, F. R. (2011).
Nonlinear H∞ controller for the quad-rotor he-
licopter with input coupling. In Proceedings of the
18th IFAC World Congress, volume 18, pages 13834–
13839.
Rampinelli, V., Brandão, A., Sarcinelli-Filho, M., Martins,
F., and Carelli, R. (2010). Embedding obstacle avoid-
ance in the control of a flexible multi-robot formation.
In IEEE Int. Symp. on Industrial Electronics (ISIE),
pages 1846–1851.
Roebuck, K. (2012). Location-Based Services (LBS): High-
impact Strategies-What You Need to Know: Defini-
tions, Adoptions, Impact, Benefits, Maturity, Vendors.
Emereo Publishing.
Sotomayor, J. F., Gómez, A. P., and Castillo, A. (2014). Vi-
sual control of an autonomous aerial vehicle for crop
inspection. Revista Politécnica, 33(1).
Tokekar, P., Vander Hook, J., Mulla, D., and Isler, V. (2013).
Sensor planning for a symbiotic UAV and UGV system
for precision agriculture. Technical Report - Dep.
Comp. Science and Eng., University of Minnesota.
Venugopalan, T., Taher, T., and Barbastathis, G. (2012). Au-
tonomous landing of an unmanned aerial vehicle on an
autonomous marine vehicle. In Oceans, 2012.
Warren, M., Corke, P., and Upcroft, B. (2015). Long-range
stereo visual odometry for extended altitude flight of
unmanned aerial vehicles. The International Journal
of Robotics Research.