look into possibilities not only to add more models or retrain existing ones, but also to fine-tune the existing neural networks, or even specific layers, so that the algorithm can be applied to incremental learning tasks. In future research, it would also make sense to add support for unstructured data.
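The layer-specific fine-tuning mentioned above can be illustrated with a minimal sketch: keep an earlier layer frozen as a feature extractor and update only a later layer on data from a new task. This is an illustrative NumPy example, not the paper's implementation; all variable names and the tiny two-layer architecture are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer regression network, assumed pre-trained on an "old" task.
W1 = rng.normal(size=(4, 8))   # first layer: kept frozen
W2 = rng.normal(size=(8, 1))   # second layer: fine-tuned on the new task

def forward(X):
    h = np.tanh(X @ W1)        # frozen feature extractor
    return h @ W2              # trainable head

# Hypothetical data from a new, incremental task.
X_new = rng.normal(size=(64, 4))
y_new = rng.normal(size=(64, 1))

initial_loss = float(np.mean((forward(X_new) - y_new) ** 2))

lr = 0.05
for _ in range(200):
    h = np.tanh(X_new @ W1)
    pred = h @ W2
    grad_W2 = h.T @ (pred - y_new) / len(X_new)  # gradient w.r.t. the head only
    W2 -= lr * grad_W2                           # W1 is never updated

final_loss = float(np.mean((forward(X_new) - y_new) ** 2))
```

Because only `W2` is updated, the old task's learned representation in `W1` is left untouched, which is one simple way to limit interference between tasks.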
ACKNOWLEDGEMENTS
This work was funded by the Federal Ministry of Education and Research under 16-DHB-4021.
Multiple Additive Neural Networks: A Novel Approach to Continuous Learning in Regression and Classification