Authors: Valls, José M.; Galván, Inés M.; Isasi, Pedro
Title: LRBNN: a Lazy Radial Basis Neural model
Citation: AI Communications 2007, vol. 20, n. 2, p. 71-86
ISSN: 0921-7126 (Print), 1875-8452 (Online)
Date issued: 2007
Date available: 2009-06-03
URI: https://hdl.handle.net/10016/4340

Abstract: In the domain of inductive learning from examples, training data are usually not evenly distributed in the input space. This makes global, eager methods such as Neural Networks inaccurate in those cases. Lazy methods, on the other hand, face the problem of how to select the best examples for each test pattern; a poor selection of training patterns leads to even worse results. In this work, we present a way of performing a trade-off between local and non-local methods using a lazy strategy. On one hand, a Radial Basis Neural Network (RBNN) is used as the learning algorithm; on the other hand, a selection of training patterns is performed locally for each query. The selection of patterns is based on an analysis of the query neighborhood, to forecast the size and elements of the best training set for that query. Moreover, the RBNN initialization algorithm has been modified in a deterministic way to eliminate any influence of the initial conditions. The method has been validated in three domains, one artificial and two time-series problems, and compared with traditional lazy methods.

Format: application/pdf
Language: eng
Rights: © IOS Press
Keywords: Lazy learning; Local learning; Radial Basis Neural Networks; Pattern selection
Type: research article
Subject: Computer Science
Access: open access
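The abstract's core idea, lazily selecting a local training set around each query and fitting a radial-basis model on it, can be illustrated with a minimal sketch. This is a generic lazy k-nearest RBF predictor, not the paper's LRBNN: the neighborhood analysis that forecasts the best training-set size and the deterministic RBNN initialization described in the article are not reproduced; `k`, `width`, and the ridge term are illustrative assumptions.

```python
import numpy as np

def lazy_rbf_predict(X, y, query, k=8, width=0.3):
    """Lazy local prediction: for each query, select the k nearest
    training patterns and fit a small Gaussian-RBF model on that
    subset only, then evaluate it at the query point.

    Sketch only: the paper's LRBNN instead forecasts the neighborhood
    size per query and initializes the RBNN deterministically."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    query = np.asarray(query, dtype=float)

    # 1. Local pattern selection: the k training patterns nearest the query.
    dist = np.linalg.norm(X - query, axis=1)
    idx = np.argsort(dist)[:k]
    X_loc, y_loc = X[idx], y[idx]

    # 2. Fit Gaussian RBF weights on the selected patterns only
    #    (small ridge term keeps the Gram matrix well conditioned).
    D = np.linalg.norm(X_loc[:, None, :] - X_loc[None, :, :], axis=2)
    Phi = np.exp(-(D / width) ** 2)
    w = np.linalg.solve(Phi + 1e-6 * np.eye(k), y_loc)

    # 3. Evaluate the local model at the query.
    phi_q = np.exp(-(np.linalg.norm(X_loc - query, axis=1) / width) ** 2)
    return float(phi_q @ w)
```

Because the model is rebuilt per query, all cost is deferred to prediction time, which is the defining trade-off of lazy learning that the article addresses.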