A Vision System for Autonomous Satellite Grapple
with Attitude Thruster
Haidong Hu, Xiaoyan Mao, Zhibin Zhu, Chunling Wei and Yingzi He
Beijing Institute of Control Engineering, Haidian District, Beijing, China
China Academy of Space Technology, Haidian District, Beijing, China
Keywords: Vision System, Satellite Grapple, Binocular Cameras.
Abstract: This paper describes an experiment with a binocular vision-based system for locating the thruster nozzle on a satellite mockup. Images of the thruster are acquired by two cameras in order to determine the thruster's 3D position. The thruster is first selected manually, and a local image region is then extracted from the raw image. Subsequently, a Canny edge detector is applied to the local image region to obtain the edge map, and a Hough Transform is performed to detect the circular features. A curve fitting method is then employed to determine the position of the center of the thruster nozzle. The end effector tracks the target thruster to a distance of 0.1 m and grasps the target by predicting its trajectory. The experiment shows that the system is robust to relative camera/target motion and successfully performs the approach and grapple procedures on the satellite mockup.
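The detection pipeline summarized above (local region extraction, Canny edge detection, Hough circle detection, and refinement of the circle center by curve fitting) can be sketched as follows. This is a minimal illustration assuming an OpenCV-style implementation; the paper does not specify a library, and the function names, thresholds, and the least-squares (Kasa) circle fit used here are assumptions rather than the authors' exact method.

import cv2
import numpy as np

def locate_nozzle_center(frame, roi):
    # frame: grayscale image from one camera
    # roi:   (x, y, w, h) local region around the manually selected thruster
    x, y, w, h = roi
    patch = frame[y:y + h, x:x + w]          # extract the local image region

    # Canny edge detection on the local region to obtain the edge map
    edges = cv2.Canny(patch, 50, 150)

    # Hough Transform to detect the circular feature of the nozzle
    circles = cv2.HoughCircles(patch, cv2.HOUGH_GRADIENT, dp=1.2, minDist=20,
                               param1=150, param2=40, minRadius=5, maxRadius=60)
    if circles is None:
        return None
    cx, cy, r = circles[0, 0]

    # collect edge pixels lying close to the detected circle
    ys, xs = np.nonzero(edges)
    near = np.abs(np.hypot(xs - cx, ys - cy) - r) < 3.0
    if near.sum() >= 5:
        # algebraic least-squares (Kasa) circle fit to refine the center:
        # x^2 + y^2 = c0*x + c1*y + c2, center = (c0/2, c1/2)
        A = np.column_stack([xs[near], ys[near], np.ones(near.sum())])
        b = xs[near] ** 2 + ys[near] ** 2
        c, _, _, _ = np.linalg.lstsq(A, b, rcond=None)
        cx, cy = c[0] / 2.0, c[1] / 2.0

    # return the nozzle center in full-image coordinates
    return float(cx + x), float(cy + y)

Running this routine on the synchronized left and right images would yield a pair of image coordinates for the nozzle center, from which the 3D position can be triangulated using the calibrated binocular camera geometry.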
1 INTRODUCTION
The benefits of on-orbit satellite servicing include satellite refueling, life extension, debris removal, repair and salvage. Robotic systems play an important role in satellite servicing, and satellite capture is a critical phase for enabling service operations. In such operations the servicing vehicle approaches the target satellite to a distance of about 2 m. A robotic manipulator is then used to autonomously capture the target satellite and perform the docking operation with the servicing vehicle.
In recent years several space missions have tested satellite capture technology. In 2007 the DARPA Orbital Express mission tested rendezvous, approach, docking and servicing, including transfers of hydrazine fuel and of battery and flight-computer orbital replacement units (Leinz 2008). The demonstration system consisted of two satellites, ASTRO and NextSat/CSC, with ASTRO as the active (chaser) vehicle and NextSat/CSC as the passive (target) vehicle. In this mission, the Advanced Video Guidance Sensor (AVGS), a laser-based tracking system, was employed to provide target attitude, range, and bearing during the chaser's short-range proximity maneuvering and docking operations, which occur in the last few hundred meters of flight down the approach corridor. The AVGS was designed as an autonomous docking sensor that relies on reflectors mounted on the target. DARPA also sponsored the FREND program to prove the capability of autonomously executing an unaided grapple of a spacecraft that was never designed to be serviced (Debus 2009). The FREND program developed and demonstrated a flight robotic arm system with its associated avionics, end-effector, and algorithms. The End Effector Vision System (EEVS) was used to guide the arm onto a hardpoint and position the fingers for a solid grapple; it used three visible-light cameras mounted near the end of the robotic arm. In 2011 DARPA sponsored the Phoenix program to develop technologies for cooperatively harvesting and reusing valuable components from retired, non-working satellites in GEO and to demonstrate the ability to create new space systems at greatly reduced cost (David 2013). One of the most difficult problems in Phoenix is rendezvous and proximity operations, up through grappling a retired satellite whose properties may be unknown. The Phoenix team undertook a campaign to test a variety of LiDAR and optical sensors (e.g., stereo CCD cameras) for use during RPO maneuvers in the mission.
Obviously, many satellites were not designed
with servicing capabilities and do not have sensors