Emotion Recognition from EEG and Facial Expressions: A Multimodal Approach

dc.contributor.author: Chaparro V.
dc.contributor.author: Gomez A.
dc.contributor.author: Salgado A.
dc.contributor.author: Quintero O.L.
dc.contributor.author: Lopez N.
dc.contributor.author: Villa L.F.
dc.contributor.department: Universidad EAFIT. Departamento de Ciencias [spa]
dc.contributor.researchgroup: Modelado Matemático [spa]
dc.creator: Chaparro V.
dc.creator: Gomez A.
dc.creator: Salgado A.
dc.creator: Quintero O.L.
dc.creator: Lopez N.
dc.creator: Villa L.F.
dc.date.accessioned: 2021-04-12T14:11:49Z
dc.date.available: 2021-04-12T14:11:49Z
dc.date.issued: 2018-01-01
dc.description.abstract: Understanding a psychological phenomenon such as emotion is of paramount importance for psychologists, since it allows them to recognize a pathology and prescribe an appropriate treatment for a patient. In approaching this problem, mathematicians and computational science engineers have proposed different unimodal techniques for emotion recognition from voice, electroencephalography, facial expressions, and physiological data. It is also well known that identifying emotions is a multimodal process. The main goal of this work is to train a computer to do the same. In this paper we present our first approach to multimodal emotion recognition via data fusion of electroencephalography and facial expressions. The selected strategy was a feature-level fusion of both electroencephalography and facial microexpressions, and the classification schemes used were a neural network model and a random forest classifier. The experimental setup was carried out with the balanced multimodal database MAHNOB-HCI. Results, at 97% accuracy, are promising compared to those of other authors. The feature-level fusion approach used in this work improves on our unimodal techniques by up to 12% per emotion. We therefore conclude that our simple but effective approach improves overall accuracy. © 2018 IEEE. [eng]
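The abstract describes a feature-level fusion of EEG and facial-microexpression features, classified with a neural network and a random forest. The record itself contains no code; the sketch below is a minimal, hypothetical illustration of that fusion strategy using scikit-learn's RandomForestClassifier. All array shapes, feature counts, and labels are placeholders, not values from the paper, and the actual feature extraction used by the authors is not reproduced here.

# Hypothetical sketch of feature-level fusion; NOT the authors' code.
# Assumes precomputed per-trial feature vectors for each modality.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials = 200                                    # placeholder trial count
eeg_features = rng.normal(size=(n_trials, 64))    # e.g. EEG band-power features
face_features = rng.normal(size=(n_trials, 34))   # e.g. microexpression descriptors
labels = rng.integers(0, 4, size=n_trials)        # placeholder emotion classes

# Feature-level (early) fusion: concatenate both modalities per trial
fused = np.hstack([eeg_features, face_features])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, fused, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")

Feature-level fusion, as named in the abstract, simply concatenates modality features before a single classifier is trained, in contrast to decision-level fusion, which combines the outputs of per-modality classifiers.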
dc.identifier: https://eafit.fundanetsuite.com/Publicaciones/ProdCientif/PublicacionFrw.aspx?id=8481
dc.identifier.doi: 10.1109/EMBC.2018.8512407
dc.identifier.issn: 0589-1019
dc.identifier.issn: 1557-170X
dc.identifier.other: PUBMED;30440451
dc.identifier.other: SCOPUS;2-s2.0-85056655879
dc.identifier.uri: http://hdl.handle.net/10784/27905
dc.language.iso: eng [eng]
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.relation.uri: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85056655879&doi=10.1109%2fEMBC.2018.8512407&partnerID=40&md5=fb06ef7c87c0ab62281ad17f00d1ece4
dc.rights: Institute of Electrical and Electronics Engineers Inc.
dc.source: IEEE Engineering in Medicine and Biology Society Conference Proceedings
dc.subject.keyword: electroencephalography [eng]
dc.subject.keyword: emotion [eng]
dc.subject.keyword: facial expression [eng]
dc.subject.keyword: factual database [eng]
dc.subject.keyword: human [eng]
dc.subject.keyword: Databases, Factual [eng]
dc.subject.keyword: Electroencephalography [eng]
dc.subject.keyword: Emotions [eng]
dc.subject.keyword: Facial Expression [eng]
dc.subject.keyword: Humans [eng]
dc.title: Emotion Recognition from EEG and Facial Expressions: A Multimodal Approach [eng]
dc.type: info:eu-repo/semantics/conferencePaper [eng]
dc.type: conferencePaper [eng]
dc.type: info:eu-repo/semantics/publishedVersion [eng]
dc.type: publishedVersion [eng]
dc.type.local: Documento de conferencia [spa]
