Citation:
Figueiras Vidal, Anibal Ramon; Ahachad, Anas; Alvarez Perez, Lorena (2017). Pre-emphasizing Binarized Ensembles to Improve Classification Performance. IWANN 2017: Advances in Computational Intelligence, pp. 339-350.
ISBN:
978-3-319-59152-0
Funder:
Comunidad de Madrid; Ministerio de Economía y Competitividad (España)
Sponsor:
This work has been partly supported by research grants CASI-CAM-CM (S2013/ICE-2845, DGUI-CM and FEDER) and Macro-ADOBE (TEC2015-67719-P, MINECO).
Serie/No.:
Lecture Notes in Computer Science
Project:
Gobierno de España. TEC2015-67719-P; Comunidad de Madrid. S2013/ICE-2845
Abstract:
Machine ensembles are learning architectures that offer high expressive capacity and, consequently, remarkable performance, owing to their large number of trainable parameters. In this paper, we explore and discuss whether binarization techniques are effective for improving standard diversification methods, and whether a simple additional trick, consisting in weighting the training examples, makes it possible to obtain better results. Experimental results on three selected classification problems show that binarization allows standard direct diversification methods (bagging, in particular) to achieve better results, with even more significant performance improvements when the training samples are pre-emphasized. Some research avenues that this finding opens are mentioned in the conclusions.
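The core idea the abstract describes — bagging binarized (two-class, ±1) weak learners while pre-emphasizing, i.e. weighting, the training examples so that selected samples appear more often in each bootstrap resample — can be sketched roughly as follows. This is a hypothetical, minimal illustration and not the authors' code: the stump learner, the weighted-resampling form of emphasis, and the toy 1-D dataset are all invented for the example.

```python
import random

def weighted_bootstrap(data, weights, rng):
    # Pre-emphasis via resampling: draw len(data) examples with
    # replacement, with probability proportional to each weight.
    return rng.choices(data, weights=weights, k=len(data))

def train_stump(sample):
    # Binarized base learner: a decision stump on 1-D data with
    # labels in {-1, +1}; pick the (threshold, sign) pair that
    # minimizes training error on the bootstrap sample.
    best = None
    for thr, _ in sample:
        for sign in (-1, 1):
            err = sum(1 for x, y in sample
                      if (sign if x >= thr else -sign) != y)
            if best is None or err < best[0]:
                best = (err, thr, sign)
    _, thr, sign = best
    return lambda x: sign if x >= thr else -sign

def bagged_predict(stumps, x):
    # Standard bagging aggregation: majority vote of the ensemble.
    vote = sum(s(x) for s in stumps)
    return 1 if vote >= 0 else -1

rng = random.Random(0)
# Toy separable problem: x < 5 maps to -1, x >= 5 maps to +1.
data = [(x, -1 if x < 5 else 1) for x in range(10)]
# Emphasize the boundary examples (x = 4 and x = 5); the paper's
# actual emphasis functions are more principled than this.
weights = [3.0 if x in (4, 5) else 1.0 for x, _ in data]
stumps = [train_stump(weighted_bootstrap(data, weights, rng))
          for _ in range(11)]
errors = sum(1 for x, y in data if bagged_predict(stumps, x) != y)
```

Up-weighting samples near the decision boundary makes each bootstrap replica more likely to contain the informative examples, which is the intuition behind pre-emphasizing the diversified ensemble.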
Description:
14th International Work-Conference on Artificial Neural Networks, IWANN 2017