Publication: Inferencia de emociones a través de detección corporal y facial
Publication date
2012-10
Defense date
2012-10-19
Abstract
Emotion, from the Latin "emotĭo" (impulse), is the reaction to an external stimulus shaped by previous experience, and the basis of non-verbal communication. The interaction paradigms between people and computers are changing (the field that studies them is known as Human-Computer Interaction, HCI): interactions today are more natural (e.g. touch devices) and typically take place at a greater distance (e.g. remote control of devices). This shift raises a problem: emotion recognition has classically relied on facial tracking, but interaction with devices at a greater distance demands the exploration of other alternatives, such as recognition through the voice or through the gestures performed by a user. The latter alternative is the one studied throughout this project. The applications of such a study are diverse: in the area of user experience, for example, it would offer a new way of evaluating usability and fully satisfying user needs, in combination with other knowledge-elicitation techniques such as interviews or surveys. Another important example lies in Human Resources tasks that require a close examination of an individual's emotional state, such as a job interview. Since affective computing is such a broad field, the objective of this project is to focus on body and face tracking for the inference of emotions by means of computer vision. For this task, the Microsoft Kinect device will be used together with its software development kit (SDK), which eases the job of recognizing both the human skeleton and facial expressions and, therefore, their subsequent analysis.
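To make the idea of inferring emotions from tracked skeleton data concrete, here is a minimal, hypothetical sketch of a rule-based posture classifier. The joint names, coordinate conventions, and heuristic rules are illustrative assumptions for this sketch, not the Kinect SDK API or the project's actual method:

```python
# Hypothetical sketch: rule-based emotion inference from body posture.
# Joint names and thresholds are assumptions, not the Kinect SDK API.
from dataclasses import dataclass

@dataclass
class Joint:
    x: float  # lateral position, metres
    y: float  # vertical position, metres

def infer_emotion(joints: dict) -> str:
    """Classify a posture from a few skeleton joints (toy heuristic)."""
    head = joints["head"]
    shoulder_center = joints["shoulder_center"]
    hand_left = joints["hand_left"]
    hand_right = joints["hand_right"]
    # Both hands raised above the head often accompany joy/excitement.
    if hand_left.y > head.y and hand_right.y > head.y:
        return "joy"
    # A head dropped below the shoulder line suggests sadness.
    if head.y < shoulder_center.y:
        return "sadness"
    return "neutral"

# Example: a pose with both hands raised above the head.
pose = {
    "head": Joint(0.0, 1.6),
    "shoulder_center": Joint(0.0, 1.4),
    "hand_left": Joint(-0.3, 1.8),
    "hand_right": Joint(0.3, 1.9),
}
print(infer_emotion(pose))  # -> joy
```

In practice a system like the one described would replace these hand-written rules with a trained pattern-recognition model over many skeleton and facial features, but the pipeline shape (tracked joints in, emotion label out) is the same.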
Keywords
Affective computing, Pattern recognition, Face tracking, Body tracking, Computer vision