Browsing by Author "Gomez A."
Showing 1 - 2 of 2
Results per page
Sort options
Item: Emotion Recognition from EEG and Facial Expressions: A Multimodal Approach (Institute of Electrical and Electronics Engineers Inc., 2018-01-01) Chaparro V.; Gomez A.; Salgado A.; Quintero O.L.; Lopez N.; Villa L.F.; Universidad EAFIT. Departamento de Ciencias; Modelado Matemático

Understanding a psychological phenomenon such as emotion is of paramount importance for psychologists, since it allows them to recognize a pathology and prescribe due treatment for a patient. In approaching this problem, mathematicians and computational science engineers have proposed different unimodal techniques for emotion recognition from voice, electroencephalography, facial expression, and physiological data. It is also well known that identifying emotions is a multimodal process. The main goal of this work is to train a computer to do so. In this paper we present our first approach to multimodal emotion recognition via data fusion of electroencephalography (EEG) and facial expressions. The selected strategy was a feature-level fusion of EEG and facial microexpressions, and the classification schemes used were a neural network model and a random forest classifier. The experimental setup was carried out with the balanced multimodal database MAHNOB-HCI. Results are promising compared to those of other authors, with 97% accuracy. The feature-level fusion approach used in this work improves on our unimodal techniques by up to 12% per emotion. Therefore, we may conclude that our simple but effective approach improves overall accuracy. © 2018 IEEE.

Item: Emotional Networked maps from EEG signals (Institute of Electrical and Electronics Engineers Inc., 2020-01-01) Gomez A.; Quintero O.L.; Lopez-Celani N.; Villa L.F.; Universidad EAFIT. Departamento de Ciencias; Modelado Matemático

EEG signals have been shown to contain relevant information for the recognition of emotional states. It is important to analyze EEG signals to understand emotional states not only from a time-series approach but also by determining the importance of the process generating these signals, the location of the electrodes, and the relationships between the EEG signals. From the EEG signals of each emotional state, a functional connectivity measure, lagged phase synchronization (LPS), was used to construct adjacency matrices; by averaging these adjacency matrices we built a prototype network for each emotion. Based on these networks, we extracted a set of node features seeking to understand their behavior and the relationships between them. Through strength and degree, we found the group of representative electrodes for each emotional state, finding differences in the intensity of the measurement and in the spatial location of these electrodes. In addition, by analyzing the clustering coefficient, degree, and strength, we found differences between the networks in the spatial patterns associated with the electrodes with the highest coefficients. This analysis can also provide evidence from the connectivity elements shared between emotional states, allowing emotions to be clustered and conclusions to be drawn about the relationships between emotions from an EEG perspective. © 2020 IEEE.
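The feature-level fusion strategy described in the first abstract, concatenating per-trial feature vectors from both modalities before classification, could be sketched as follows. The feature dimensions, the random data, and the shapes are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

# Hypothetical feature dimensions (illustrative, not from the paper):
# 32 EEG-derived features and 20 facial-microexpression features per trial.
rng = np.random.default_rng(0)
n_trials = 10
eeg_features = rng.normal(size=(n_trials, 32))   # e.g. band powers per channel
face_features = rng.normal(size=(n_trials, 20))  # e.g. microexpression descriptors

def feature_level_fusion(eeg, face):
    """Feature-level fusion: concatenate each trial's EEG and facial
    feature vectors into a single joint vector before classification."""
    return np.hstack([eeg, face])

fused = feature_level_fusion(eeg_features, face_features)
print(fused.shape)  # -> (10, 52): one joint 52-dimensional vector per trial
```

In the paper, a representation like this would then be passed to the classifiers mentioned in the abstract, a neural network or a random forest.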
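The node features named in the second abstract (strength, degree, and clustering coefficient) can be sketched from a weighted adjacency matrix. The matrix below is a toy stand-in for an averaged LPS prototype network, and the 0.25 binarization threshold is an assumption for illustration, not a value from the paper.

```python
import numpy as np

# Toy symmetric connectivity matrix standing in for an averaged
# lagged-phase-synchronization (LPS) adjacency matrix over four electrodes.
# The weights are illustrative, not real LPS measurements.
W = np.array([
    [0.0, 0.8, 0.3, 0.0],
    [0.8, 0.0, 0.5, 0.2],
    [0.3, 0.5, 0.0, 0.7],
    [0.0, 0.2, 0.7, 0.0],
])

# Node strength: sum of connection weights incident to each electrode.
strength = W.sum(axis=1)

# Node degree: number of connections above a chosen threshold (assumed 0.25).
A = (W > 0.25).astype(int)
degree = A.sum(axis=1)

def clustering(A):
    """Binary clustering coefficient: the fraction of a node's
    neighbor pairs that are themselves connected."""
    n = len(A)
    coeffs = np.zeros(n)
    for i in range(n):
        nbrs = np.flatnonzero(A[i])
        k = len(nbrs)
        if k < 2:
            continue
        links = A[np.ix_(nbrs, nbrs)].sum() / 2  # edges among the neighbors
        coeffs[i] = 2 * links / (k * (k - 1))
    return coeffs

print(degree)  # -> [2 2 3 1]
print(strength)
print(clustering(A))
```

Comparing these per-electrode values across the prototype networks of different emotions is, per the abstract, how representative electrodes and shared connectivity patterns are identified.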