Gaussian Nonlinear Line Attractor for Learning Multidimensional Data

Authors: Theus H. Aspiras 1; Vijayan K. Asari 1 and Wesam Sakla 2

Affiliations: 1 University of Dayton, United States; 2 Air Force Research Laboratory, Wright-Patterson Air Force Base, United States

Keyword(s): Nonlinear Line Attractor, Multidimensional Data, Neural Networks, Machine Learning.

Related Ontology Subjects/Areas/Topics: Artificial Intelligence ; Biomedical Engineering ; Biomedical Signal Processing ; Computational Intelligence ; Data Manipulation ; Health Engineering and Technology Applications ; Human-Computer Interaction ; Image Processing and Artificial Vision Applications ; Methodologies and Methods ; Neural Networks ; Neurocomputing ; Neurotechnology, Electronics and Informatics ; Pattern Recognition ; Physiological Computing Systems ; Sensor Networks ; Signal Processing ; Soft Computing ; Theory and Methods

Abstract: The human brain’s ability to extract information from multidimensional data is modeled by the Nonlinear Line Attractor (NLA), where nodes are connected by polynomial weight sets. Each neuron in this architecture is assumed to be fully connected to all other neurons, creating a huge web of connections. We envision instead that each neuron should be connected only to a group of surrounding neurons, with weighted connection strengths that decrease with distance from the neuron. To develop this weighted NLA architecture, we use a Gaussian weighting strategy to model the proximity, which also reduces computation time significantly. Once the NLA network has been trained on all the data, the weight set can be reduced using a locality-preserving nonlinear dimensionality reduction technique. By reducing the weight sets in this way, we can reduce the number of outputs needed for recognition tasks. An appropriate distance measure can then be used to compare test data with the trained data when processed through the NLA architecture. We observe that the proposed GNLA algorithm reduces training time significantly and provides even better recognition using fewer dimensions than the original NLA algorithm. We have tested this algorithm on several datasets, including the EO Synthetic Vehicle database and the Sheffield face database, and shown that it performs well.
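
A minimal illustrative sketch of the Gaussian proximity weighting described in the abstract, written in Python (not the authors' implementation): it assumes neurons arranged on a 2-D lattice and a free bandwidth parameter sigma; the function name gaussian_weights and the grid layout are hypothetical choices for illustration only.

import numpy as np

def gaussian_weights(grid_shape, sigma=2.0):
    """Connection strengths between lattice neurons that decay with distance.

    Returns an (N, N) matrix where N = rows * cols; entry (i, j) is a Gaussian
    of the Euclidean distance between neuron positions i and j on the grid.
    """
    rows, cols = grid_shape
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    # Pairwise squared Euclidean distances between neuron positions
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Example: 64 neurons on an 8 x 8 grid; distant pairs get near-zero weight,
# so each neuron is effectively connected only to its local neighborhood.
W = gaussian_weights((8, 8), sigma=2.0)

Scaling or masking the NLA's polynomial weight updates by such a matrix is one way to realize the mostly-local connectivity the abstract describes; the paper's exact formulation may differ.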

CC BY-NC-ND 4.0

Paper citation in several formats:
Aspiras, T. H.; Asari, V. K. and Sakla, W. (2015). Gaussian Nonlinear Line Attractor for Learning Multidimensional Data. In Proceedings of the 7th International Joint Conference on Computational Intelligence (ECTA 2015) - NCTA; ISBN 978-989-758-157-1, SciTePress, pages 130-137. DOI: 10.5220/0005597801300137

@conference{ncta15,
author={Theus H. Aspiras and Vijayan K. Asari and Wesam Sakla},
title={Gaussian Nonlinear Line Attractor for Learning Multidimensional Data},
booktitle={Proceedings of the 7th International Joint Conference on Computational Intelligence (ECTA 2015) - NCTA},
year={2015},
pages={130-137},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005597801300137},
isbn={978-989-758-157-1},
}

TY - CONF

JO - Proceedings of the 7th International Joint Conference on Computational Intelligence (ECTA 2015) - NCTA
TI - Gaussian Nonlinear Line Attractor for Learning Multidimensional Data
SN - 978-989-758-157-1
AU - Aspiras, T. H.
AU - Asari, V. K.
AU - Sakla, W.
PY - 2015
SP - 130
EP - 137
DO - 10.5220/0005597801300137
PB - SciTePress