Image Feature Significance for Self-position Estimation with Variable Processing Time
Kae Doki, Manabu Tanabe, Akihiro Torii, Akiteru Ueda
2009
Abstract
We have been researching action planning methods for an autonomous mobile robot based on real-time search. In action planning based on real-time search, the time spent on sensing must be balanced against the time spent on action planning in order to use limited computational resources efficiently. We have therefore studied sensing methods whose processing time is variable, and constructed a self-position estimation system with variable processing time as an example of such sensing. In this paper, we propose a self-position estimation method for an autonomous mobile robot based on image feature significance. In this method, the processing time for self-position estimation can be varied by changing the number of image features used, according to their significance. To realize this concept, we introduce the notion of significance for image features and evaluate three equations, each of which expresses the significance of an image feature.
References
- K. Fujisawa et al.: A Real-time Search for Autonomous Mobile Robots, J. of the Robotics Society of Japan, Vol. 17, No. 4, pp. 503-512, 1999
- K. Doki et al.: Environment Representation Method Suitable for Action Acquisition of an Autonomous Mobile Robot, Proc. of International Conference on Computer, Communication and Control Technologies, Vol. 5, pp. 283-288, 2003
- S. Zilberstein et al.: Anytime Sensing, Planning and Action: A Practical Model for Robot Control, ACM Magazine, 1996
- K. Doki et al.: Self-Position Estimation of Autonomous Mobile Robot Considering Variable Processing Time, Proc. of the 11th IASTED International Conference on Artificial Intelligence and Soft Computing, 584-053 (CD-ROM), 2007
- T. Taketomi, T. Sato, N. Yokoya: Fast and Robust Camera Parameter Estimation Using a Feature Landmark Database with Priorities of Landmarks, Technical Report of IEICE, PRMU, Vol. 107, No. 427, pp. 281-286, 2008
- N. Mitsunaga, M. Asada: Visual Attention Control Based on Information Criterion for a Legged Mobile Robot that Observes the Environment while Walking, J. of the Robotics Society of Japan, Vol. 21, No. 7, pp. 819-827, 2003
- Y. Satoh et al.: Robust Image Registration Using Selective Correlation Coefficient, Trans. of IEE Japan, Vol. 121-C, No. 4, pp. 1-8, 2001
- M. S. Nixon, A. S. Aguado: Feature Extraction and Image Processing, 2nd ed., Academic Press, 2008
- M. Asada et al.: Purposive Behavior Acquisition for a Robot by Vision-Based Reinforcement Learning, J. of the Robotics Society of Japan, Vol. 13, No. 1, pp. 68-74, 1995
- H. Koyasu et al.: Recognizing Moving Obstacles for Robot Navigation Using Real-Time Omni-directional Stereo Vision, J. of Robotics and Mechatronics, Vol. 14, No. 2, pp. 147-156, 2002
Paper Citation
in Harvard Style
Doki K., Tanabe M., Torii A. and Ueda A. (2009). Image Feature Significance for Self-position Estimation with Variable Processing Time. In Proceedings of the 5th International Workshop on Artificial Neural Networks and Intelligent Information Processing - Volume 1: Workshop ANNIIP, (ICINCO 2009), ISBN 978-989-674-002-3, pages 134-142. DOI: 10.5220/0002261001340142
in Bibtex Style
@conference{workshop_anniip09,
author={Kae Doki and Manabu Tanabe and Akihiro Torii and Akiteru Ueda},
title={Image Feature Significance for Self-position Estimation with Variable Processing Time},
booktitle={Proceedings of the 5th International Workshop on Artificial Neural Networks and Intelligent Information Processing - Volume 1: Workshop ANNIIP, (ICINCO 2009)},
year={2009},
pages={134-142},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0002261001340142},
isbn={978-989-674-002-3},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 5th International Workshop on Artificial Neural Networks and Intelligent Information Processing - Volume 1: Workshop ANNIIP, (ICINCO 2009)
TI - Image Feature Significance for Self-position Estimation with Variable Processing Time
SN - 978-989-674-002-3
AU - Doki K.
AU - Tanabe M.
AU - Torii A.
AU - Ueda A.
PY - 2009
SP - 134
EP - 142
DO - 10.5220/0002261001340142