Publication:
Using a Mahalanobis-like distance to train Radial Basis Neural Networks

Identifiers
ISSN: 0302-9743 (Print)
ISSN: 1611-3349 (Online)
ISBN: 978-3-540-26208-4
Publication date
2005-06-21
Publisher
Springer
Abstract
Radial Basis Neural Networks (RBNN) can approximate any regular function and have a faster training phase than other similar neural networks. However, the activation of each neuron depends on the Euclidean distance between a pattern and the neuron center. Therefore, the activation function is radially symmetric and all attributes are considered equally relevant. This can be addressed by altering the metric used in the activation function (i.e., using non-symmetrical metrics). The Mahalanobis distance is one such metric: it takes into account the variability of the attributes and their correlations. However, this distance is computed directly from the variance-covariance matrix and does not take the accuracy of the learning algorithm into account. In this paper, we propose using a generalized Euclidean metric, following the Mahalanobis structure, but evolved by a Genetic Algorithm (GA). The GA searches for the distance matrix that minimizes the error produced by a fixed RBNN. Our approach has been tested on two domains, with positive results in both cases.
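The core idea in the abstract can be illustrated with a minimal NumPy sketch: a Gaussian RBF neuron whose activation uses a generalized Euclidean distance d(x, c) = sqrt((x − c)ᵀ M (x − c)). This is an assumption-laden illustration, not the paper's implementation; the matrix M shown here is hand-picked, whereas in the paper it would be the matrix found by the GA, and the function and variable names are hypothetical.

```python
import numpy as np

def mahalanobis_like_distance(x, c, M):
    """Generalized Euclidean distance d(x, c) = sqrt((x - c)^T M (x - c)).

    With M = identity this reduces to the ordinary Euclidean distance;
    with M = inverse covariance matrix it is the classical Mahalanobis
    distance. In the paper's setting, M would instead be a positive
    semi-definite matrix evolved by a Genetic Algorithm.
    """
    diff = x - c
    return np.sqrt(diff @ M @ diff)

def rbf_activation(x, center, M, width=1.0):
    """Gaussian RBF neuron activation using the generalized metric."""
    d = mahalanobis_like_distance(x, center, M)
    return np.exp(-d**2 / (2 * width**2))

x = np.array([1.0, 2.0])
c = np.array([0.0, 0.0])

# M = identity: plain Euclidean distance, radially symmetric activation,
# all attributes equally relevant.
d_euclid = mahalanobis_like_distance(x, c, np.eye(2))  # sqrt(5)

# A non-identity M (illustrative choice) weights the first attribute
# more heavily, so the activation is no longer radially symmetric.
M = np.array([[4.0, 0.0],
              [0.0, 1.0]])
d_weighted = mahalanobis_like_distance(x, c, M)  # sqrt(8)
```

The GA's role, per the abstract, would be to search over such matrices M to minimize the error of a fixed RBNN, rather than deriving M from the variance-covariance matrix of the data.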
Description
Proceeding of: International Work-Conference on Artificial Neural Networks (IWANN 2005)
Keywords
Radial Basis Neural Networks
Bibliographic citation
Computational Intelligence and Bioinspired Systems, Springer, June 2005, pp. 257-263