cases are shown in Figure 12. The images shown in this
figure are cropped regions of the geo-image, to enhance
visibility. Their intensity, however, is left unchanged,
so as to reflect the true illumination conditions of the
geo-image.
Figure 13: Estimation error with the INS-Landstel fusion.
In contrast, on the same trial, the INS-Landstel
fusion avoids both of these errors, as shown in Figure
13. With the motion estimate provided by the inertial
sensor, the search zone within the geo-image is well
focused, which reduces the probability of false
matches.
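To make the role of the inertial prediction concrete, the sketch below shows one way an INS position estimate and its covariance could bound the search zone in geo-image pixels. The function name, the 3-sigma bound, the planar-position state and all parameter values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def search_window(pos_xy, cov_xy, metres_per_pixel, n_sigma=3.0):
    """Bound the search zone in the geo-image from the INS position
    prediction: a minimal sketch under assumed names and units."""
    sigmas = np.sqrt(np.diag(cov_xy))            # 1-sigma along x, y (metres)
    half = n_sigma * sigmas / metres_per_pixel   # half-extent in pixels
    centre = np.asarray(pos_xy) / metres_per_pixel
    lo = np.floor(centre - half).astype(int)
    hi = np.ceil(centre + half).astype(int)
    return lo, hi                                # pixel bounds of the zone

# Example: a 50 m standard deviation on a 10 m/pixel geo-image
# confines landmark matching to roughly a +/-15 pixel window.
lo, hi = search_window((12000.0, 8500.0),
                       np.diag([50.0**2, 50.0**2]),
                       metres_per_pixel=10.0)
```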
6 CONCLUSIONS
In this paper, we have demonstrated the ability of a
vision-based algorithm coupled with an inertial sensor
to localize a spacecraft absolutely with respect to an
orbiter image. As in an INS-GPS fusion problem, the
advantages obtained are twofold. First, the localization
precision is higher. Second, the search zone within the
geo-image for the Landstel algorithm is greatly reduced,
which both speeds up the algorithm and reduces the
probability of false matches.
However, the fusion mechanism introduced in this
paper exchanges only the position information (both
global and relative) of the two sensors. A tighter
integration of the two sensors, based on the interest
points they both detect, is currently being analysed
and evaluated. First results have shown that this type
of integration is promising.
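As one way to picture the loosely coupled exchange described above, the sketch below applies a textbook Kalman measurement update in which the absolute position fix from the vision algorithm corrects an INS state. The state layout, names and noise models are assumptions for illustration; the paper's actual filter formulation may differ.

```python
import numpy as np

def fuse_landstel_fix(x_ins, P_ins, z_fix, R_fix):
    """Loosely coupled update: correct the INS state with the absolute
    position fix obtained by matching against the geo-image.
    State assumed to be [x, y, vx, vy]."""
    H = np.zeros((2, 4))
    H[0, 0] = H[1, 1] = 1.0             # only position is observed
    y = z_fix - H @ x_ins               # innovation
    S = H @ P_ins @ H.T + R_fix         # innovation covariance
    K = P_ins @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_upd = x_ins + K @ y
    P_upd = (np.eye(4) - K @ H) @ P_ins
    return x_upd, P_upd
```

In return, the corrected state and its reduced covariance would feed the next search-window prediction, closing the loop between the two sensors.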