Publication:
Error-Tolerant Computation for Voting Classifiers With Multiple Classes

dc.affiliation.dpto: UC3M. Departamento de Ingeniería Telemática
dc.affiliation.grupoinv: UC3M. Grupo de Investigación: Network Technologies
dc.affiliation.instituto: UC3M. Instituto Universitario de Estudios de Género
dc.contributor.author: Liu, Shanshan
dc.contributor.author: Reviriego Vasallo, Pedro
dc.contributor.author: Montuschi, Paolo
dc.contributor.author: Lombardi, Fabrizio
dc.contributor.funder: Ministerio de Economía y Competitividad (España)
dc.contributor.funder: Comunidad de Madrid
dc.date.accessioned: 2021-02-09T15:43:58Z
dc.date.available: 2021-02-09T15:43:58Z
dc.date.issued: 2020-09-21
dc.description.abstract: In supervised learning, labeled data are provided as inputs and a model is then learned to classify new observations. Error tolerance should be guaranteed for classifiers when they are employed in critical applications. A widely used type of classifier is based on voting, either among instances (referred to as single voter classifiers) or among multiple voters (referred to as ensemble classifiers). When such classifiers are implemented on a processor, Time-Based Modular Redundancy (TBMR) techniques are often used for protection due to the inflexibility of the hardware. In TBMR, any single error can be handled at the cost of additional computation, either once for detection or twice for correction after detection; however, this technique increases the computation overhead by at least 100%. The Voting Margin (VM) scheme has recently been proposed to reduce the computation overhead of TBMR, but it has only been applied to k Nearest Neighbors (kNNs) classifiers with two classes. In this paper, the VM scheme is extended to multiple classes, as well as to other voting classifiers, by exploiting the intrinsic robustness of the algorithms. kNNs (a single voter classifier) and Random Forest (RF) (an ensemble classifier) are considered to evaluate the proposed scheme. Using multiple datasets, the results show that the proposed scheme significantly reduces the computation overhead: by more than 70% for kNNs with good classification accuracy and by more than 90% for RF in all cases. However, when extended to multiple classes, the VM scheme for kNNs is not efficient for some datasets. In this paper, a new protection scheme referred to as k+1 NNs is presented as an alternative option to provide efficient protection in those scenarios. In the new scheme, the computation overhead can be further reduced at the cost of allowing a very low percentage of errors that can modify the classification outcome.
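As a rough illustration of the voting-margin idea summarized in the abstract, the following Python sketch checks whether the gap between the winning class and the runner-up in a multi-class kNN vote is large enough that a single error could not change the outcome, so the redundant recomputation can be skipped. The function and variable names (needs_recomputation, neighbor_labels) are hypothetical, and the assumption that one error can swing at most two votes is a simplification, not necessarily the exact condition used in the paper.

from collections import Counter

def needs_recomputation(neighbor_labels, max_errors=1):
    # Count votes per class among the k nearest neighbors, most frequent first.
    counts = Counter(neighbor_labels).most_common()
    top = counts[0][1]
    runner_up = counts[1][1] if len(counts) > 1 else 0
    # Assumption: each error can at most take one vote away from the winning
    # class and give one to another class, i.e. a swing of two votes.
    return (top - runner_up) <= 2 * max_errors

# Example: votes from k = 7 neighbors over three classes.
print(needs_recomputation(["A", "A", "A", "A", "A", "B", "C"]))  # False: margin 4, outcome cannot flip
print(needs_recomputation(["A", "A", "A", "A", "B", "B", "C"]))  # True: margin 2, recomputation needed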
dc.description.sponsorship: This work was supported in part by the ACHILLES Project PID2019-104207RB-I00 and the Go2Edge network RED2018-102585-T funded by the Spanish Ministry of Economy and Competitiveness, in part by the Department of Research and Innovation of the Madrid Regional Authority, in part by the EMPATIA-CM Research Project (Reference Y2018/TCS-5046), and in part by NSF under Grants CCF-1953961 and 1812467.
dc.identifier.bibliographicCitation: S. Liu, P. Reviriego, P. Montuschi and F. Lombardi, "Error-Tolerant Computation for Voting Classifiers With Multiple Classes," in IEEE Transactions on Vehicular Technology, vol. 69, no. 11, pp. 13718-13727, Nov. 2020.
dc.identifier.doi: https://dx.doi.org/10.1109/TVT.2020.3025739
dc.identifier.issn: 0018-9545
dc.identifier.publicationfirstpage: 13718
dc.identifier.publicationissue: 11
dc.identifier.publicationlastpage: 13727
dc.identifier.publicationtitle: IEEE Transactions on Vehicular Technology
dc.identifier.publicationvolume: 69
dc.identifier.uri: https://hdl.handle.net/10016/31890
dc.identifier.uxxi: AR/0000026482
dc.language.iso: eng
dc.publisher: IEEE
dc.relation.projectID: Gobierno de España. PID2019-104207RB-I00
dc.relation.projectID: Gobierno de España. RED2018-102585-T
dc.relation.projectID: Comunidad de Madrid. Y2018/TCS-5046/EMPATIA-CM
dc.rights: © 2020 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
dc.rights.accessRights: open access
dc.subject.eciencia: Telecomunicaciones
dc.subject.other: Machine learning
dc.subject.other: Voting classifier
dc.subject.other: Error tolerance
dc.subject.other: k nearest neighbors
dc.subject.other: Random forest
dc.title: Error-Tolerant Computation for Voting Classifiers With Multiple Classes
dc.type: research article
dc.type.hasVersion: AM
dspace.entity.type: Publication
Files
Original bundle
Name: Liu_etal_Classifiers_TVT_2020_ps.pdf
Size: 1.88 MB
Format: Adobe Portable Document Format