Title: Optimizing data transformations for classification tasks
Authors: Valls, José M.; Aler, Ricardo
Date accessioned: 2010-01-25
Date available: 2010-01-25
Date issued: 2009-09
Citation: Intelligent Data Engineering and Automated Learning - IDEAL 2009: 10th International Conference, Burgos, Spain, September 23-26, 2009. Springer, 2009, pp. 176-183
ISBN: 978-3-642-04393-2
ISSN: 0302-9743 (print); 1611-3349 (online)
URI: https://hdl.handle.net/10016/6599
Description: Proceedings of the 10th International Conference, IDEAL 2009, Burgos, Spain, September 23-26, 2009
Abstract: Many classification algorithms use the concept of distance or similarity between patterns. Previous work has shown that it is advantageous to optimize general Euclidean distances (GED). In this paper, data transformations are optimized instead. This is equivalent to searching for GEDs, but can be applied to any learning algorithm, even if it does not use distances explicitly. Two optimization techniques have been used: a simple Local Search (LS) and the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), an advanced evolutionary method for optimization in difficult continuous domains. Both diagonal and complete matrices have been considered. Results show that, in general, complete matrices found by CMA-ES either outperform or match both Local Search and the classifier working on the original, untransformed data.
Format: application/pdf
Language: eng
Rights: © Springer
Keywords: Data transformations; General Euclidean distances; Evolutionary computation; Evolutionary-based machine learning
Subject: Computer Science (Informática)
Type: conference paper
DOI: 10.1007/978-3-642-04394-9_22
Access: open access
Pages: 176-183
Book title: Intelligent Data Engineering and Automated Learning - IDEAL 2009
Volume: 5788
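The sketch below illustrates the idea described in the abstract: optimize the entries of a linear data transformation with CMA-ES so that a distance-based classifier improves its cross-validated accuracy. It is a minimal illustration, not the authors' implementation; it assumes the third-party `cma` package and scikit-learn with the Iris dataset as a stand-in problem, none of which are mentioned in the record.

```python
# Illustrative sketch (not the paper's code): search for a full transformation
# matrix A with CMA-ES, scoring each candidate by the cross-validated accuracy
# of a 1-NN classifier on the transformed data x -> A x.
import numpy as np
import cma
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
n_features = X.shape[1]

def fitness(flat_matrix):
    """Negative 5-fold CV accuracy of 1-NN on the transformed data."""
    A = flat_matrix.reshape(n_features, n_features)
    X_transformed = X @ A.T
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=1),
                          X_transformed, y, cv=5).mean()
    return -acc  # CMA-ES minimizes, so negate the accuracy

# Start from the identity transformation, i.e. the plain Euclidean distance.
x0 = np.eye(n_features).ravel()
es = cma.CMAEvolutionStrategy(x0, 0.5, {'maxiter': 50, 'verbose': -9})
es.optimize(fitness)

A_best = es.result.xbest.reshape(n_features, n_features)
print("best CV accuracy found:", -es.result.fbest)
```

Restricting the search to the diagonal entries of A (a feature-weighting transformation) would correspond to the diagonal-matrix case mentioned in the abstract, with only n parameters to optimize instead of n².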