An Alternative to Restricted-Boltzmann Learning for Binary Latent Variables based on the Criterion of Maximal Mutual Information

David Edelman

Abstract

The latent-binary-variable training problem arising in the pre-training of Deep Neural Networks is approached using the Principle (and related Criterion) of Maximum Mutual Information (MMI). This is presented as an alternative to the widely accepted 'Restricted Boltzmann Machine' (RBM) approach of Hinton. The primary contribution of the present article is to argue that the MMI approach is the logically simpler and more natural means to the same ends. Additionally, the relative ease and effectiveness of the approach are demonstrated for an example case.
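To make the criterion concrete, the following is a minimal sketch (not the paper's actual algorithm, whose details are not given on this page) of training a single binary latent unit by directly maximizing the empirical mutual information I(X; H) between the data X and a Bernoulli latent H with p(H=1|x) = sigmoid(w·x + b). It uses the identity I(X; H) = H(H) − H(H|X) and plain finite-difference gradient ascent; all names and parameter values are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def binary_entropy(p, eps=1e-12):
    # Entropy (in nats) of a Bernoulli(p) variable, clipped for stability.
    p = np.clip(p, eps, 1 - eps)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def mutual_information(w, b, X):
    """I(X; H) for one Bernoulli latent H with p(H=1|x) = sigmoid(w.x + b),
    where X is taken uniform over the rows of the data matrix:
    I = H(H) - H(H|X)."""
    q = sigmoid(X @ w + b)                       # p(H=1 | x_i) per sample
    return binary_entropy(q.mean()) - binary_entropy(q).mean()

# Toy data: two noisy clusters of 8-bit binary vectors (assumed setup).
rng = np.random.default_rng(0)
proto = rng.integers(0, 2, size=(2, 8)).astype(float)
X = np.vstack([np.abs(p - (rng.random((50, 8)) < 0.1)) for p in proto])

w, b = rng.normal(size=8) * 0.01, 0.0
mi0 = mutual_information(w, b, X)

# Finite-difference gradient ascent on the MI objective.
lr, h = 0.5, 1e-5
for _ in range(300):
    grad_w = np.array([(mutual_information(w + h * e, b, X)
                        - mutual_information(w - h * e, b, X)) / (2 * h)
                       for e in np.eye(8)])
    grad_b = (mutual_information(w, b + h, X)
              - mutual_information(w, b - h, X)) / (2 * h)
    w, b = w + lr * grad_w, b + lr * grad_b

print(f"I(X;H): {mi0:.3f} -> {mutual_information(w, b, X):.3f} nats "
      f"(max for one binary unit: {np.log(2):.3f})")
```

By the concavity of the binary entropy, I(X; H) is non-negative and bounded above by log 2 for a single binary latent; with several latent units one would sum or jointly estimate such terms, which is where design choices beyond this sketch begin.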

Paper Citation


in Harvard Style

Edelman, D. (2019). An Alternative to Restricted-Boltzmann Learning for Binary Latent Variables based on the Criterion of Maximal Mutual Information. In Proceedings of the 11th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART, ISBN 978-989-758-350-6, pages 865-868. DOI: 10.5220/0007618608650868


in Bibtex Style

@conference{icaart19,
author={David Edelman},
title={An Alternative to Restricted-Boltzmann Learning for Binary Latent Variables based on the Criterion of Maximal Mutual Information},
booktitle={Proceedings of the 11th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART},
year={2019},
pages={865-868},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0007618608650868},
isbn={978-989-758-350-6},
}


in EndNote Style

TY - CONF

JO - Proceedings of the 11th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART
TI - An Alternative to Restricted-Boltzmann Learning for Binary Latent Variables based on the Criterion of Maximal Mutual Information
SN - 978-989-758-350-6
AU - Edelman D.
PY - 2019
SP - 865
EP - 868
DO - 10.5220/0007618608650868