Publication:
Framework for the Classification of Emotions in People With Visual Disabilities Through Brain Signals

dc.affiliation.dpto: UC3M. Departamento de Informática
dc.affiliation.grupoinv: UC3M. Grupo de Investigación: Human Language and Accessibility Technologies (HULAT)
dc.contributor.author: López Hernández, Jesús Leonardo
dc.contributor.author: González Carrasco, Israel
dc.contributor.author: López Cuadrado, José Luis
dc.contributor.author: Ruiz Mezcua, María Belén
dc.date.accessioned: 2022-02-24T08:26:34Z
dc.date.available: 2022-02-24T08:26:34Z
dc.date.issued: 2021-05-07
dc.description.abstract: Nowadays, the recognition of emotions in people with sensory disabilities still represents a challenge due to the difficulty of generalizing and modeling the set of brain signals. In recent years, the technology used to study a person's behavior and emotions based on brain signals has been the brain–computer interface (BCI). Although previous works have already proposed the classification of emotions in people with sensory disabilities using machine learning techniques, a model for the recognition of emotions in people with visual disabilities has not yet been evaluated. Consequently, in this work, the authors present a twofold framework focused on people with visual disabilities. Firstly, auditory stimuli have been used, and a component for the acquisition and extraction of brain signals has been defined. Secondly, analysis techniques for the modeling of emotions have been developed, and machine learning models for the classification of emotions have been defined. Based on the results, the algorithm with the best performance in the validation is random forest (RF), with accuracies of 85% and 88% in the classification of negative and positive emotions, respectively. According to the results, the framework is able to classify positive and negative emotions, but the experimentation performed also shows that the framework's performance depends on the number of features in the dataset, and that the quality of the electroencephalogram (EEG) signals is a determining factor.
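The classification step described in the abstract — a random forest model separating positive from negative emotions on features extracted from EEG signals — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic data, feature count, and hyperparameters are assumptions for demonstration only.

```python
# Hypothetical sketch: binary emotion classification with a random forest,
# as described in the abstract. The feature matrix stands in for features
# extracted from EEG recordings; it is synthetic, not the paper's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Rows = trials, columns = extracted EEG features (illustrative shape).
X = rng.normal(size=(200, 10))
# Labels: 0 = negative emotion, 1 = positive emotion (synthetic rule).
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"validation accuracy: {acc:.2f}")
```

As the abstract notes, accuracy in such a pipeline depends on the number of features in the dataset and on the quality of the underlying EEG signals.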
dc.description.sponsorship: This work was supported by the National Council of Science and Technology of Mexico (CONACyT), through grant number 709656.
dc.format.extent: 18
dc.identifier.bibliographicCitation: López-Hernández, J. L., González-Carrasco, I., López-Cuadrado, J. L., & Ruiz-Mezcua, B. (2021). Framework for the Classification of Emotions in People With Visual Disabilities Through Brain Signals. Frontiers in Neuroinformatics, 15.
dc.identifier.doi: https://doi.org/10.3389/fninf.2021.642766
dc.identifier.issn: 1662-5196
dc.identifier.publicationfirstpage: 1
dc.identifier.publicationlastpage: 18
dc.identifier.publicationtitle: Frontiers in Neuroinformatics
dc.identifier.publicationvolume: 15
dc.identifier.uri: https://hdl.handle.net/10016/34228
dc.identifier.uxxi: AR/0000028639
dc.language.iso: eng
dc.publisher: Frontiers Media
dc.rights: Copyright © 2021 López-Hernández, González-Carrasco, López-Cuadrado and Ruiz-Mezcua.
dc.rights: Attribution 3.0 Spain (Atribución 3.0 España)
dc.rights.accessRights: open access
dc.rights.uri: http://creativecommons.org/licenses/by/3.0/es/
dc.subject.eciencia: Informática
dc.subject.other: Affective computing
dc.subject.other: Brain-computer interface
dc.subject.other: Emotion classification algorithm
dc.subject.other: Machine learning
dc.subject.other: Visual disabilities
dc.title: Framework for the Classification of Emotions in People With Visual Disabilities Through Brain Signals
dc.type: research article
dc.type.hasVersion: VoR
dspace.entity.type: Publication
Files
Original bundle
Name: Framework_FIN_2021.pdf
Size: 2.52 MB
Format: Adobe Portable Document Format