PARALLEL EVALUATION OF HOPFIELD NEURAL NETWORKS
Antoine Eiche, Daniel Chillet, Sebastien Pillement, Olivier Sentieys
2011
Abstract
Among the large number of possible optimization algorithms, Hopfield Neural Networks (HNN) offer interesting characteristics for online use. Indeed, this particular optimization algorithm can produce solutions within a short delay. These solutions are produced by the convergence of the HNN, which was originally defined for a sequential evaluation of the neurons. Because this sequential evaluation leads to long convergence times, we assume that convergence can be accelerated through a parallel evaluation of the neurons. However, the original constraints no longer ensure the convergence of an HNN evaluated in parallel. This article shows how the neurons can be evaluated in parallel so as to accelerate a hardware or multiprocessor implementation while still ensuring convergence. The parallelization method is illustrated on a simple task scheduling problem, where we obtain a significant acceleration that grows with the number of tasks. For instance, with 20 tasks the speedup factor is about 25.
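To make the distinction between sequential and parallel evaluation concrete, the short Python sketch below (illustrative only, not the authors' implementation) contrasts an asynchronous sweep, where each neuron sees the states already updated before it, with a synchronous sweep, where all neurons are updated from the same previous state and can therefore be evaluated at once. The weight matrix W, threshold vector theta, and energy function follow the standard binary Hopfield formulation; the data are arbitrary toy values, not the task scheduling problem of the paper.

import numpy as np

def energy(W, theta, x):
    # Standard Hopfield energy: E = -1/2 * x^T W x + theta^T x
    return -0.5 * x @ W @ x + theta @ x

def sequential_step(W, theta, x):
    # Asynchronous sweep: neurons are updated one at a time, each seeing
    # the already-updated states of its predecessors. With a symmetric W
    # and zero diagonal this is the classical convergent scheme.
    x = x.copy()
    for i in range(len(x)):
        x[i] = 1.0 if W[i] @ x - theta[i] >= 0 else 0.0
    return x

def parallel_step(W, theta, x):
    # Synchronous sweep: every neuron is updated from the same previous
    # state, so the whole vector is obtained from one matrix product.
    # Convergence is no longer guaranteed in general, which is the issue
    # addressed in the paper.
    return np.where(W @ x - theta >= 0, 1.0, 0.0)

# Toy example with a random symmetric, zero-diagonal weight matrix.
rng = np.random.default_rng(0)
n = 20
W = rng.normal(size=(n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
theta = np.zeros(n)
x = rng.integers(0, 2, size=n).astype(float)

for _ in range(50):
    x = sequential_step(W, theta, x)
print("energy after sequential sweeps:", energy(W, theta, x))

Under these standard assumptions the asynchronous sweep monotonically decreases the energy, while the synchronous variant trades that guarantee for one-shot evaluation of all neurons.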
References
- Del Balio, R., Tarantino, E., and Vaccaro, R. (1992). A parallel algorithm for asynchronous Hopfield neural networks. In IEEE International Workshop on Emerging Technologies and Factory Automation, pages 666-669.
- Domeika, M. J. and Page, E. W. (1996). Hopfield neural network simulation on a massively parallel machine. Information Sciences, 91(1-2):133-145.
- Hopfield, J. and Tank, D. (1985). "Neural" computation of decisions in optimization problems. Biological Cybernetics, 52(3):141-152.
- Kamp, Y. and Hasler, M. (1990). Recursive neural networks for associative memory. John Wiley & Sons, Inc.
- Mańdziuk, J. (2002). Neural networks for the N-Queens problem: a review. Control and Cybernetics, 31(2):217-248.
- Sidney, J. (1977). Optimal single-machine scheduling with earliness and tardiness penalties. Operations Research, 25(1):62-69.
- Smith, K. (1999). Neural networks for combinatorial optimization: a review of more than a decade of research. INFORMS Journal on Computing, 11:15-34.
- Tagliarini, G., Christ, J., and Page, E. (1991). Optimization using neural networks. IEEE Transactions on Computers, 40(12):1347-1358.
- Wang, C., Wang, H., and Sun, F. (2008). Hopfield neural network approach for task scheduling in a grid environment. In Proceedings of the 2008 International Conference on Computer Science and Software Engineering - Volume 04, pages 811-814.
- Wilson, R. C. (2009). Parallel Hopfield networks. Neural Computation, 21:831-850.
- Xu, Z., Hu, G., and Kwong, C. (1996). Asymmetric Hopfield-type networks: theory and applications. Neural Networks, 9(3):483-501.
Paper Citation
in Harvard Style
Eiche A., Chillet D., Pillement S. and Sentieys O. (2011). PARALLEL EVALUATION OF HOPFIELD NEURAL NETWORKS. In Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011) ISBN 978-989-8425-84-3, pages 248-253. DOI: 10.5220/0003682902480253
in Bibtex Style
@conference{ncta11,
author={Antoine Eiche and Daniel Chillet and Sebastien Pillement and Olivier Sentieys},
title={PARALLEL EVALUATION OF HOPFIELD NEURAL NETWORKS},
booktitle={Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011)},
year={2011},
pages={248-253},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003682902480253},
isbn={978-989-8425-84-3},
}
in EndNote Style
TY - CONF
JO - Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011)
TI - PARALLEL EVALUATION OF HOPFIELD NEURAL NETWORKS
SN - 978-989-8425-84-3
AU - Eiche A.
AU - Chillet D.
AU - Pillement S.
AU - Sentieys O.
PY - 2011
SP - 248
EP - 253
DO - 10.5220/0003682902480253