Publication:
A new boosting design of Support Vector Machine classifiers

Publication date
2015-09
Publisher
Elsevier
Abstract
Boosting algorithms attend to the particular structure of the training data when learning, by iteratively emphasizing the importance of the training samples according to how difficult they are to classify correctly. If common kernel Support Vector Machines (SVMs) are used as base learners to construct a Real AdaBoost ensemble, the resulting ensemble can easily be compacted into a monolithic architecture by simply combining the weights that correspond to the same kernels when they appear in different learners, so this potential advantage comes without any increase in operating computational effort. In this way, the performance advantage that boosting provides can be obtained for monolithic SVMs, i.e., without paying a classification-time computational cost for the many learners that are needed. However, SVMs are both stable and strong, so using them for boosting requires destabilizing and weakening them, and previous attempts in this direction have shown only moderate success. In this paper, we propose combining a new, appropriately designed subsampling process with an SVM algorithm that permits sparsity control, in order to overcome these difficulties and obtain boosted SVM designs with improved performance. Experimental results support the effectiveness of the approach, not only in performance but also in the compactness of the resulting classifiers, and show that combining both design ideas is needed to arrive at these advantageous designs.
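The compaction step described in the abstract can be illustrated with a short sketch. The following Python code is a minimal illustration, not the authors' implementation; the function names, the RBF kernel choice, and the assumption that every learner is a kernel expansion over the same training samples with the same kernel are all ours. It shows how the coefficients of Real AdaBoost kernel-SVM learners collapse into a single monolithic kernel expansion:

import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Pairwise RBF kernel matrix between the rows of X and Z.
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def compact_ensemble(learner_coefs, learner_biases):
    # Each learner t computes f_t(x) = sum_i beta[t, i] * K(x, x_i) + b_t,
    # and Real AdaBoost outputs F(x) = sum_t f_t(x). Because all learners
    # share the same kernel evaluations K(x, x_i), the ensemble reduces to
    # one coefficient per training sample plus a single bias.
    beta = np.asarray(learner_coefs).sum(axis=0)  # shape: (n_train,)
    bias = float(np.sum(learner_biases))
    return beta, bias

def predict(beta, bias, X_train, X_test, gamma=1.0):
    # Evaluate the compacted, monolithic classifier on new samples.
    K = rbf_kernel(X_test, X_train, gamma)
    return np.sign(K @ beta + bias)

Under these assumptions, classification cost is that of a single SVM over the union of the learners' support vectors, however many boosting rounds were run; the subsampling and sparsity-controlled (linear programming) SVM training proposed in the paper then serve to keep that union small.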
Keywords
Real AdaBoost, Subsampling, Support vector machines, Linear programming, Ensemble classifiers
Bibliographic citation
Mayhua-López, E., Gómez-Verdejo, V. & Figueiras-Vidal, A. R. (2015). A new boosting design of Support Vector Machine classifiers. Information Fusion, 25, 63–71.