It is difficult to establish a precise definition of emotion, but an emerging consensus holds that emotions are mental states.[4] These mental states are brain responses to internal and external stimuli, such as listening to an orchestra, interacting with other human beings, or sudden psychological and neuroendocrine changes, for instance when recalling an event or an image stored in memory.[52] Like other mental experiences, these emotional states are manifested as the result of nerve activity in the brain,[4] so it can be inferred that an emotional state corresponds to a pattern in EEG signals.
Therefore, a joint effort was undertaken with the Medical Technology Laboratory (GATEME) of the Universidad Nacional de San Juan, Argentina, and with the Psychology, Education and Culture Research Group of the Institución Universitaria Politécnico Grancolombiano, Colombia. These efforts focus on the recognition of patterns associated with emotional states in electroencephalographic signals.
We also know how to work with voice signals.
An approach to emotion recognition
First, the Psychology, Education and Culture Research Group conducted a study on evoking emotions in mother-child dyads. The study involved 8 subjects: 4 women (mothers) and 4 children (3 boys and 1 girl) with a mean age of 22 months. The following protocol was used. Each mother first recorded, in a separate room, an account of the happiest moment of her life (the happiness stimulus) and of the saddest moment of her life (the sadness stimulus). Each dyad was then placed face to face; through headphones, the mother listened to the story she had previously recorded, evoking in herself the corresponding feeling of happiness or sadness and, in turn, evoking that emotion in her child, who was looking at her. A neutral state was recorded before each emotion-evocation session.
A simple graphical interface was designed that allows the analysis to be replicated with different signals.
A Wavelet Analysis
After verifying the good performance of the wavelet-transform analysis, we added the emotional state of neutrality to our study, this time using classic classifiers such as QDA, KNN, and RFC, obtaining an average classification accuracy of 87% over the three emotional states.
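The pipeline described above can be sketched as follows. This is a minimal illustration, not our actual implementation: a hand-rolled Haar decomposition stands in for the wavelet feature extraction, and the three "emotional states" are synthetic signals with different dominant frequencies.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def haar_dwt_energies(signal, levels=4):
    """Haar wavelet decomposition of a 1-D signal; returns the energy
    of the detail coefficients at each level plus the final
    approximation -- a simple surrogate for EEG band power."""
    energies = []
    a = signal
    for _ in range(levels):
        a = a[: len(a) // 2 * 2]              # force even length
        d = (a[0::2] - a[1::2]) / np.sqrt(2)  # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)  # approximation
        energies.append(np.sum(d ** 2))
    energies.append(np.sum(a ** 2))
    return np.array(energies)

# Synthetic stand-in for three states (e.g. happy / sad / neutral):
# each class is an oscillation at a different dominant frequency.
fs, n = 128, 512
t = np.arange(n) / fs
X, y = [], []
for label, freq in enumerate([5.0, 10.0, 20.0]):
    for _ in range(30):
        sig = np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(n)
        X.append(haar_dwt_energies(sig))
        y.append(label)
X, y = np.array(X), np.array(y)

results = {}
for name, clf in [("QDA", QuadraticDiscriminantAnalysis()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("RFC", RandomForestClassifier(n_estimators=100,
                                                random_state=0))]:
    results[name] = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {results[name]:.2f}")
```

On this toy data each wavelet level isolates one class's frequency band, so all three classifiers separate the states almost perfectly; real EEG is far noisier.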
More emotions and more channels
To evaluate the performance of our techniques on a larger set of emotional states, we worked with the HCI Tagging database, which includes EEG recordings with 34 channels and 9 emotional states. We restricted the database to simulate the behavior of the Emotiv EPOC headset and evaluated our algorithms on this problem, obtaining an average classification accuracy of 88% over the nine emotional states.
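Restricting a multi-channel recording to the Emotiv EPOC montage can be sketched as below. The 14 EPOC electrode labels follow the standard 10-20 system; the extra channel labels in the example (`EXT0`, `EXT1`, ...) are illustrative placeholders, not the actual labels of the HCI Tagging database.

```python
import numpy as np

# The 14 electrode positions of the Emotiv EPOC headset
# (10-20 system labels).
EPOC_CHANNELS = ["AF3", "F7", "F3", "FC5", "T7", "P7", "O1",
                 "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"]

def restrict_to_epoc(eeg, channel_names):
    """Keep only the rows of `eeg` (channels x samples) whose labels
    appear in the EPOC montage, returned in EPOC order."""
    idx = [channel_names.index(ch) for ch in EPOC_CHANNELS
           if ch in channel_names]
    return eeg[idx, :], [channel_names[i] for i in idx]

# Hypothetical 34-channel recording:
names = EPOC_CHANNELS + [f"EXT{i}" for i in range(20)]
eeg = np.random.default_rng(0).standard_normal((34, 256))
sub, kept = restrict_to_epoc(eeg, names)
print(sub.shape)  # (14, 256)
```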
Our next goal is to create our own database, for which EmotivUI was designed.
Brain + Voice + Face = Emotions
Understanding psychological phenomena such as emotions is a particular need for psychologists, who must recognize a pathology in order to prescribe treatment for a patient. However, recognizing emotions can be a real challenge when people are not in a condition to communicate how they feel, as with children who do not yet talk, the elderly, or adults who have suffered some disability. To address this problem, mathematics and the computational sciences have proposed different techniques for emotion recognition from human physiological signals such as voice, electroencephalography, facial expression, temperature, and heart rate. The Mathematical Modelling research group (GRIMMAT) of EAFIT has developed algorithms for emotion recognition using each of these signals. The main goal of this research is to carry out a deep study of GRIMMAT's research background and of the unimodal approaches (voice, temperature, etc.) implemented in recent years, in order to develop a multimodal methodology that integrates the single models, thus improving accuracy in the recognition task.
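One simple way to integrate unimodal models is late fusion: each modality produces class probabilities, and the final decision averages them. The sketch below illustrates this idea only; the probability vectors are hypothetical and the equal weighting is an assumption, not the methodology the group has adopted.

```python
import numpy as np

def late_fusion(prob_list, weights=None):
    """Weighted average of per-modality class-probability vectors,
    renormalized so the fused vector sums to 1."""
    probs = np.array(prob_list, dtype=float)
    if weights is None:
        weights = np.ones(len(probs)) / len(probs)  # equal weights
    fused = np.average(probs, axis=0, weights=weights)
    return fused / fused.sum()

# Hypothetical outputs of three unimodal recognizers for the
# classes (happy, sad, neutral):
p_eeg   = [0.6, 0.3, 0.1]
p_voice = [0.5, 0.2, 0.3]
p_face  = [0.7, 0.2, 0.1]
fused = late_fusion([p_eeg, p_voice, p_face])
print(fused.argmax())  # 0 -> "happy"
```

In practice the weights could be learned from validation data, so that more reliable modalities (e.g. EEG when the face is occluded) dominate the fused decision.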