A CONNECTIONIST APPROACH
IN BAYESIAN CLASSIFICATION
Luminita State
Department of Computer Science, University of Pitesti, Pitesti, Romania
Catalina Cocianu
Department of Computer Science, Academy of Economic Studies, Bucharest, Romania
Panayiotis Vlamos
Department of Computer Science, Ionian University, Corfu, Greece
Viorica Stefanescu
Department of Mathematics, Academy of Economic Studies, Bucharest, Romania
Keywords: Hidden Markov Models, learning by examples, Bayesian classification, training algorithm, neural
computation.
Abstract: The research reported in this paper aims at developing a suitable neural architecture for implementing
the Bayesian procedure in solving pattern recognition problems. The proposed neural system is based on an
inhibitory competition installed among the hidden neurons of the computation layer. The local memories of
the hidden neurons are computed adaptively according to an estimation model of the parameters of the
Bayesian classifier. The paper also reports a series of qualitative attempts to analyze the behavior of a
new procedure for learning the parameters of an HMM by modeling different types of stochastic dependencies
on the state space of the underlying finite automaton. The approach aims at developing
new methods for processing image and speech signals in solving pattern recognition problems.
Basically, the attempts are stated in terms of weighting processes and deterministic/non-deterministic
Bayesian procedures.
1 PRELIMINARIES
Stochastic models represent a very promising
approach to temporal pattern recognition. An
important class of stochastic models is based on
Markovian state transitions, two typical
examples being the Markov model (MM) and the
Hidden Markov Model (HMM). In a Markov model,
the transitions between states are governed by the
transition probabilities; that is, the state sequence is
a Markov process, and each state is
directly observed as the output feature. Usually,
however, there are two sorts of variables to be taken
into consideration, namely the manifest variables,
which can be directly observed, and the latent variables,
which are hidden to the observer. The HMM is
based on a doubly stochastic process, one producing
an (unobservable) state sequence and the other producing an
observable feature sequence.
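The doubly stochastic structure described above can be illustrated with a minimal generative sketch. The two-state model below is hypothetical (the tables `initial`, `transition`, and `emission`, as well as the state and symbol names, are illustrative assumptions, not taken from the paper): the hidden chain evolves by the transition probabilities, while each observation is drawn from the emission distribution of the current, unobserved state.

```python
import random

# Hypothetical two-state HMM; all probability tables are illustrative.
states = ("s0", "s1")
symbols = ("a", "b")

initial = {"s0": 0.5, "s1": 0.5}                       # P(state_1)
transition = {"s0": {"s0": 0.7, "s1": 0.3},            # P(next | current)
              "s1": {"s0": 0.4, "s1": 0.6}}
emission = {"s0": {"a": 0.9, "b": 0.1},                # P(symbol | state)
            "s1": {"a": 0.2, "b": 0.8}}

def draw(dist):
    """Sample a key from a {key: probability} dictionary."""
    r, acc = random.random(), 0.0
    for key, p in dist.items():
        acc += p
        if r < acc:
            return key
    return key  # guard against floating-point rounding

def sample_hmm(length):
    """Run the doubly stochastic process: the hidden state sequence is
    Markovian, and each observation depends only on the current state."""
    state = draw(initial)
    hidden, observed = [], []
    for _ in range(length):
        hidden.append(state)
        observed.append(draw(emission[state]))
        state = draw(transition[state])
    return hidden, observed
```

Only the `observed` sequence would be available to the learner; the `hidden` sequence plays the role of the latent variables.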
The doubly stochastic process is useful in coping
with the unpredictable variation of the observed patterns,
and its design requires a learning phase in which the
parameters of both the state transition and the emission
distributions have to be estimated from the observed
data. The trained HMM can then be used in the
retrieval (recognition) phase, when test
sequences of (complete or incomplete) observations
have to be recognized.
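In the recognition phase, a standard way to score a test sequence against a trained HMM is the forward algorithm, which computes the likelihood of the observations by summing over all hidden state paths. The sketch below assumes a hypothetical two-state model (the probability tables are illustrative, not parameters estimated by the paper's method):

```python
# Illustrative two-state model; tables are hypothetical.
states = ("s0", "s1")
initial = {"s0": 0.5, "s1": 0.5}
transition = {"s0": {"s0": 0.7, "s1": 0.3},
              "s1": {"s0": 0.4, "s1": 0.6}}
emission = {"s0": {"a": 0.9, "b": 0.1},
            "s1": {"a": 0.2, "b": 0.8}}

def forward_likelihood(obs):
    """Forward algorithm: compute P(obs | model) by summing over all
    hidden state paths, in O(T * |states|^2) time."""
    # alpha[s] = P(o_1..o_t, state_t = s)
    alpha = {s: initial[s] * emission[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emission[s][o] * sum(alpha[r] * transition[r][s]
                                         for r in states)
                 for s in states}
    return sum(alpha.values())
```

A Bayesian classifier over several trained HMMs would evaluate this likelihood under each class model and pick the class with the largest posterior.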
The latent structure of the observable phenomenon is
modeled in terms of a finite automaton Q, the
observable variable being thought of as the output
State L., Cocianu C., Vlamos P. and Stefanescu V. (2007).
A CONNECTIONIST APPROACH IN BAYESIAN CLASSIFICATION.
In Proceedings of the Ninth International Conference on Enterprise Information Systems - AIDSS, pages 185-190
DOI: 10.5220/0002346401850190
Copyright © SciTePress