Title: Using a Mahalanobis-like distance to train Radial Basis Neural Networks
Authors: Valls, José M.; Aler, Ricardo; Fernández, Óscar
Date available: 2009-12-14
Date issued: 2005-06-21
Citation: Computational intelligence and bioinspired systems, Springer, June 2005, p. 257-263
ISBN: 978-3-540-26208-4
ISSN: 0302-9743 (Print); 1611-3349 (Online)
URI: https://hdl.handle.net/10016/6033
Description: Proceedings of: International Work-Conference on Artificial Neural Networks (IWANN 2005)
Abstract: Radial Basis Neural Networks (RBNN) can approximate any regular function and have a faster training phase than other similar neural networks. However, the activation of each neuron depends on the Euclidean distance between a pattern and the neuron center. Therefore, the activation function is symmetrical and all attributes are considered equally relevant. This could be solved by altering the metric used in the activation function (i.e., using non-symmetrical metrics). The Mahalanobis distance is such a metric: it takes into account the variability of the attributes and their correlations. However, this distance is computed directly from the variance-covariance matrix and does not consider the accuracy of the learning algorithm. In this paper, we propose to use a generalized Euclidean metric, following the Mahalanobis structure, but evolved by a Genetic Algorithm (GA). This GA searches for the distance matrix that minimizes the error produced by a fixed RBNN. Our approach has been tested on two domains and positive results have been observed in both cases.
Format: application/pdf
Language: English
Rights: © Springer
Keywords: Radial Basis Neural Networks
Type: conference paper
DOI: 10.1007/11494669_32
Access: open access
Pages: 257-263
Book title: Computational intelligence and bioinspired systems
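
The abstract describes replacing the Euclidean distance inside each neuron's activation with a generalized, Mahalanobis-like distance whose matrix is evolved by a GA so as to minimize the error of a fixed RBNN. The sketch below is only an illustration of that idea under assumed names (rbf_activation, rbnn_error) and an assumed Gaussian basis function; it is not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): a Mahalanobis-like RBF activation
# where the distance matrix M is a free parameter that a GA could evolve,
# rather than being fixed to the inverse variance-covariance matrix.
import numpy as np

def rbf_activation(x, center, M, width):
    """Gaussian RBF using the generalized distance d^2 = (x - c)^T M (x - c)."""
    diff = x - center
    d2 = diff @ M @ diff  # reduces to the squared Euclidean distance when M = I
    return np.exp(-d2 / (2.0 * width ** 2))

def rbnn_error(M, centers, widths, weights, X, y):
    """Candidate fitness for the GA: mean squared error of a fixed RBNN under M."""
    Phi = np.array([[rbf_activation(x, c, M, w) for c, w in zip(centers, widths)]
                    for x in X])
    return np.mean((Phi @ weights - y) ** 2)
```

A GA would then encode candidate matrices M as individuals and minimize rbnn_error as its fitness function, keeping the RBNN's centers, widths, and output weights fixed, as the abstract indicates.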