RT Conference Proceedings
T1 Using a Mahalanobis-like distance to train Radial Basis Neural Networks
A1 Valls, José M.
A1 Aler, Ricardo
A1 Fernández, Óscar
AB Radial Basis Neural Networks (RBNN) can approximate any regular function and have a faster training phase than other similar neural networks. However, the activation of each neuron depends on the Euclidean distance between a pattern and the neuron center. Therefore, the activation function is symmetrical and all attributes are considered equally relevant. This could be solved by altering the metric used in the activation function (i.e. using non-symmetrical metrics). The Mahalanobis distance is one such metric: it takes into account the variability of the attributes and their correlations. However, this distance is computed directly from the variance-covariance matrix and does not consider the accuracy of the learning algorithm. In this paper, we propose to use a generalized Euclidean metric with the structure of the Mahalanobis distance, but whose matrix is evolved by a Genetic Algorithm (GA). This GA searches for the distance matrix that minimizes the error produced by a fixed RBNN. Our approach has been tested on two domains, and positive results have been observed in both cases.
PB Springer
SN 978-3-540-26208-4
SN 0302-9743 (Print)
SN 1611-3349 (Online)
YR 2005
FD 2005-06-21
LK https://hdl.handle.net/10016/6033
UL https://hdl.handle.net/10016/6033
LA eng
NO Proceeding of: International Work-Conference on Artificial Neural Networks (IWANN 2005)
DS e-Archivo
RD 19 May 2024
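The abstract describes replacing the Euclidean distance in an RBF neuron's activation with a Mahalanobis-like generalized distance d(x, c)² = (x − c)ᵀ M (x − c). A minimal NumPy sketch of that idea follows; the function name, the Gaussian activation form, and the width parameter are illustrative assumptions (the paper evolves M with a GA, which is not reproduced here):

```python
import numpy as np

def rbf_activation(x, center, M, width=1.0):
    """Activation of one RBF neuron under a generalized (Mahalanobis-like)
    distance d(x, c)^2 = (x - c)^T M (x - c), for a symmetric positive
    semi-definite matrix M. M = I recovers the ordinary Euclidean case.
    The Gaussian form and `width` are illustrative choices, not the paper's."""
    diff = x - center
    d2 = diff @ M @ diff
    return np.exp(-d2 / (2.0 * width ** 2))

x = np.array([1.0, 2.0])
c = np.zeros(2)

# Euclidean case: M is the identity, so all attributes count equally.
print(rbf_activation(x, c, np.eye(2)))

# Mahalanobis case: M is the inverse covariance of some training data,
# so correlated or high-variance attributes are reweighted.
X = np.random.default_rng(0).normal(size=(100, 2))
M = np.linalg.inv(np.cov(X, rowvar=False))
print(rbf_activation(x, c, M))
```

In the paper's approach, M would not be fixed to the inverse covariance; instead a GA searches over candidate matrices for the one minimizing the error of a fixed RBNN.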