Simplified Information Acquisition Method to Improve Prediction Performance: Direct Use of Hidden Neuron Outputs and Separation of Information Acquisition and Use Phase

Ryotaro Kamimura

2014

Abstract

In this paper, we propose a new information-theoretic method to improve prediction performance in supervised learning, with two main technical features. First, the complicated procedures used to increase information content are replaced by the direct use of hidden neuron outputs: we realize higher information by directly changing the outputs from the hidden neurons. Second, because it is difficult to increase information content while simultaneously decreasing the error between targets and outputs, we separate learning into an information acquisition phase and an information use phase. In the information acquisition phase, an auto-encoder acquires as much information on the input patterns as possible. In the information use phase, the information obtained in the acquisition phase is used for supervised learning. The method is thus a simplified version of full information maximization that deals directly with the outputs from the neurons. We applied the method to a protein classification problem. Experimental results showed that our simplified information acquisition method was effective in increasing the real information content and that, by using this information, prediction performance was greatly improved.
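The two-phase scheme described above can be illustrated with a minimal NumPy sketch. This is not the paper's actual algorithm: the data, network sizes, and in particular the "direct change of hidden outputs" (here modeled by simply sharpening the sigmoid outputs with an exponent) are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data standing in for the protein classification problem:
# 20 samples, 5 features, binary targets (hypothetical).
X = rng.normal(size=(20, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

n_hidden = 3
lr = 0.1
W_enc = rng.normal(scale=0.1, size=(5, n_hidden))
W_dec = rng.normal(scale=0.1, size=(n_hidden, 5))

# Phase 1 (information acquisition): train an auto-encoder so that the
# hidden layer captures as much information on the inputs as possible.
for _ in range(500):
    H = sigmoid(X @ W_enc)
    err = H @ W_dec - X                     # reconstruction error
    W_dec -= lr * H.T @ err / len(X)        # gradient of squared error
    dH = err @ W_dec.T * H * (1 - H)
    W_enc -= lr * X.T @ dH / len(X)

# Simplified information increase: directly modify the hidden outputs so
# that a few neurons respond strongly. The exponent is an assumed
# sharpening choice, not the paper's exact transformation.
H = sigmoid(X @ W_enc) ** 2

# Phase 2 (information use): train only the output weights on the fixed
# hidden representation obtained in phase 1.
W_out = rng.normal(scale=0.1, size=(n_hidden, 1))
for _ in range(2000):
    out = sigmoid(H @ W_out)
    W_out -= lr * H.T @ ((out - y) * out * (1 - out)) / len(X)

acc = ((sigmoid(H @ W_out) > 0.5) == (y > 0.5)).mean()
```

Keeping the encoder weights fixed in phase 2 is what separates the two objectives: reconstruction pressure and prediction pressure never compete within a single optimization.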



Paper Citation


in Harvard Style

Kamimura R. (2014). Simplified Information Acquisition Method to Improve Prediction Performance: Direct Use of Hidden Neuron Outputs and Separation of Information Acquisition and Use Phase. In Proceedings of the International Workshop on Artificial Neural Networks and Intelligent Information Processing - Volume 1: ANNIIP, (ICINCO 2014) ISBN 978-989-758-041-3, pages 78-87. DOI: 10.5220/0005134700780087


in Bibtex Style

@conference{anniip14,
author={Ryotaro Kamimura},
title={Simplified Information Acquisition Method to Improve Prediction Performance: Direct Use of Hidden Neuron Outputs and Separation of Information Acquisition and Use Phase},
booktitle={Proceedings of the International Workshop on Artificial Neural Networks and Intelligent Information Processing - Volume 1: ANNIIP, (ICINCO 2014)},
year={2014},
pages={78-87},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005134700780087},
isbn={978-989-758-041-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Workshop on Artificial Neural Networks and Intelligent Information Processing - Volume 1: ANNIIP, (ICINCO 2014)
TI - Simplified Information Acquisition Method to Improve Prediction Performance: Direct Use of Hidden Neuron Outputs and Separation of Information Acquisition and Use Phase
SN - 978-989-758-041-3
AU - Kamimura R.
PY - 2014
SP - 78
EP - 87
DO - 10.5220/0005134700780087