Acquiring Method for Agents’ Actions using Pheromone Communication between Agents

Hisayuki Sasaoka

2013

Abstract

It is known that the Ant Colony System (ACS) algorithm and the Max-Min Ant System (MM-AS), which builds on ACS, are powerful meta-heuristic algorithms, and several researchers have reported their effectiveness in a range of applications. On the other hand, these algorithms exhibit problems when employed in multi-agent systems, and we have previously proposed a new method that improves on MM-AS. This paper describes the results of evaluation experiments with agents that implement our proposed method. In these experiments, we used seven maps and scenarios for the RoboCup Rescue Simulation system (RCRS). To confirm the effectiveness of our method, we examined the agents' fire-fighting actions in the simulation and the resulting improvements in their scores.
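
The abstract refers to MM-AS (MAX-MIN Ant System), whose distinguishing feature is that pheromone trail values are kept within lower and upper limits. As background only, the following minimal Python sketch illustrates that bounded update step; the names (tau, best_tour, rho, tau_min, tau_max) and the dictionary-of-edges representation are illustrative assumptions and do not reproduce the paper's agent-communication variant.

def update_pheromone(tau, best_tour, best_cost, rho=0.1, tau_min=0.01, tau_max=5.0):
    """Evaporate all trails, reinforce the best tour, clamp to [tau_min, tau_max]."""
    # evaporation on every stored edge
    for edge in tau:
        tau[edge] *= (1.0 - rho)
    # reinforce the edges of the best tour found so far (closed tour)
    for edge in zip(best_tour, best_tour[1:] + best_tour[:1]):
        tau[edge] = tau.get(edge, tau_min) + 1.0 / best_cost
    # MM-AS trail limits: keep every value inside [tau_min, tau_max]
    for edge in tau:
        tau[edge] = min(tau_max, max(tau_min, tau[edge]))
    return tau

# usage sketch: edges are (node_i, node_j) tuples, costs are assumed values
tau = update_pheromone({}, best_tour=[0, 2, 1, 3], best_cost=42.0)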



Paper Citation


in Harvard Style

Sasaoka H. (2013). Acquiring Method for Agents’ Actions using Pheromone Communication between Agents. In Proceedings of the 5th International Joint Conference on Computational Intelligence - Volume 1: ECTA, (IJCCI 2013) ISBN 978-989-8565-77-8, pages 91-96. DOI: 10.5220/0004538300910096


in Bibtex Style

@conference{ecta13,
author={Hisayuki Sasaoka},
title={Acquiring Method for Agents’ Actions using Pheromone Communication between Agents},
booktitle={Proceedings of the 5th International Joint Conference on Computational Intelligence - Volume 1: ECTA, (IJCCI 2013)},
year={2013},
pages={91-96},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004538300910096},
isbn={978-989-8565-77-8},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 5th International Joint Conference on Computational Intelligence - Volume 1: ECTA, (IJCCI 2013)
TI - Acquiring Method for Agents’ Actions using Pheromone Communication between Agents
SN - 978-989-8565-77-8
AU - Sasaoka H.
PY - 2013
SP - 91
EP - 96
DO - 10.5220/0004538300910096