network during the learning process, maximizing the
available computational resources.
5 CONCLUSIONS AND FUTURE WORK
In this work we have presented an adaptation of the OANN online learning algorithm (Pérez-Sánchez et al., 2013) that modifies the network topology according to the needs of the learning process. The network structure begins with the minimum number of hidden neurons, and a new unit is added whenever the current topology is not sufficient to satisfy the needs of the process. Moreover, the method saves both temporal and spatial resources, an important characteristic when a large amount of data must be handled during training or when the problem is complex and requires a network with a high number of nodes to be solved.
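The grow-on-demand behavior described above can be illustrated with a minimal sketch. The class, the error-growth tolerance, and all names below are hypothetical illustrations of the general constructive idea, not the actual OANN update rules.

```python
class GrowingNetwork:
    """Toy controller for a single-hidden-layer topology that grows
    one unit at a time (illustrative sketch, not the OANN method)."""

    def __init__(self, min_hidden=1, max_hidden=32, tolerance=1.05):
        self.hidden = min_hidden        # start with the minimum topology
        self.max_hidden = max_hidden
        self.tolerance = tolerance      # allowed error-growth factor
        self.best_error = float("inf")  # best error seen for current topology

    def observe_error(self, error):
        """Add a hidden unit when the running error stops improving."""
        if error < self.best_error:
            self.best_error = error     # current topology still suffices
        elif (error > self.tolerance * self.best_error
              and self.hidden < self.max_hidden):
            self.hidden += 1            # topology inadequate: add a unit
            self.best_error = error     # reset the baseline for the new topology
```

For example, feeding the error sequence 1.0, 0.8, 0.9, 1.2 to `observe_error` grows the network from one to three hidden units, since each of the last two errors exceeds the tolerated growth over the best error seen so far.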
In spite of these favorable characteristics, some aspects require in-depth study and will be addressed as future work. One new line of research is to limit the addition of hidden units by employing different measures, for example, the increasing tendency of the errors committed. A modification of the method could also be proposed to include a pruning technique, allowing not only the addition but also the removal of unnecessary hidden units according to the complexity of the learning task.
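One standard pruning criterion that such a modification could build on is magnitude-based pruning, a well-known technique (see Reed, 1993) rather than anything proposed in this paper. The function and threshold below are hypothetical:

```python
def prune_units(outgoing_weights, threshold=1e-2):
    """Magnitude-based pruning sketch: keep only the hidden units whose
    outgoing weights are not all negligible.  `outgoing_weights[i]` is
    the list of weights leaving hidden unit i; the indices of the
    surviving units are returned."""
    return [i for i, w in enumerate(outgoing_weights)
            if max(abs(v) for v in w) >= threshold]
```

With `outgoing_weights = [[0.5, -0.3], [1e-4, 1e-3], [0.0, 0.02]]`, unit 1 contributes almost nothing to the output layer and is removed, while units 0 and 2 are kept.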
ACKNOWLEDGEMENTS
The authors would like to acknowledge support for this work from the Xunta de Galicia (Grant codes CN2011/007 and CN2012/211) and the Secretaría de Estado de Investigación of the Spanish Government (Grant code TIN2012-37954), all partially supported by European Union ERDF funds.
REFERENCES
Ash, T. (1989). Dynamic node creation in backpropagation
networks. Connection Science, 1(4):365–375.
Aylward, S. and Anderson, R. (1991). An algorithm for
neural network architecture generation. In AIAA Com-
puting in Aerospace Conference VIII.
Baum, E. B. and Haussler, D. (1989). What size net gives
valid generalization? Neural Computation, 1(1):151–
160.
Bishop, C. M. (1995). Neural Networks for Pattern Recog-
nition. Oxford University Press, New York.
Fiesler, E. (1994). Comparative Bibliography of Ontogenic Neural Networks. In Proceedings of the International Conference on Artificial Neural Networks (ICANN 1994), pages 793–796.
Fontenla-Romero, O., Guijarro-Berdiñas, B., Pérez-Sánchez, B., and Alonso-Betanzos, A. (2010). A new convex objective function for the supervised learning of single-layer neural networks. Pattern Recognition, 43(5):1984–1992.
Gama, J., Medas, P., Castillo, G., and Rodrigues, P. (2004).
Learning with drift detection. Intelligent Data Analy-
sis, 8:213–237.
Hénon, M. (1976). A two-dimensional mapping with a strange attractor. Communications in Mathematical Physics, 50(1):69–77.
Islam, M., Sattar, A., Amin, F., Yao, X., and Murase, K.
(2009). A new adaptive merging and growing algo-
rithm for designing artificial neural networks. IEEE
Transactions on Neural Networks, 20:1352–1357.
Kwok, T.-Y. and Yeung, D.-Y. (1997). Constructive Algo-
rithms for Structure Learning in FeedForward Neural
Networks for Regression Problems. IEEE Transac-
tions on Neural Networks, 8(3):630–645.
Ma, L. and Khorasani, K. (2003). A new strategy for adap-
tively constructing multilayer feedforward neural net-
works. Neurocomputing, 51:361–385.
Mackey, M. and Glass, L. (1977). Oscillation and chaos in physiological control systems. Science, 197(4300):287–289.
Martínez-Rego, D., Pérez-Sánchez, B., Fontenla-Romero, O., and Alonso-Betanzos, A. (2011). A robust incremental learning method for non-stationary environments. Neurocomputing, 74(11):1800–1808.
Murata, N. (1994). Network Information Criterion-
Determining the number of hidden units for an Arti-
ficial Neural Network Model. IEEE Transactions on
Neural Networks, 5(6):865–872.
Nguyen, D. and Widrow, B. (1990). Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights. In Proceedings of the International Joint Conference on Neural Networks (IJCNN 1990), volume 3, pages 21–26.
Parekh, R., Yang, J., and Honavar, V. (2000). Construc-
tive Neural-Network Learning Algorithms for Pattern
Classification.
Pérez-Sánchez, B., Fontenla-Romero, O., Guijarro-Berdiñas, B., and Martínez-Rego, D. (2013). An online learning algorithm for adaptable topologies of neural networks. Expert Systems with Applications, 40(18):7294–7304.
Reed, R. (1993). Pruning Algorithms: A Survey. IEEE
Transactions on Neural Networks, 4:740–747.
Sharma, S. K. and Chandra, P. (2010). Constructive neural
networks: A review. International Journal of Engi-
neering Science and Technology, 2(12):7847–7855.
Vapnik, V. (1998). Statistical Learning Theory. John Wiley
& Sons, Inc. New York.
Yao, X. (1999). Evolving Artificial Neural Networks. In
Proceedings of the IEEE, volume 87, pages 1423–
1447.
Self-adaptive Topology Neural Network for Online Incremental Learning