Real Time Classification of Emotions to Control Stage Lighting During Dance Performance

Date

2016-08

Abstract

Recently, there has been growing research in Electroencephalography (EEG) based recognition of emotions, a branch of affective computing, in which subjects are either shown pictures to elicit the necessary emotional response or asked to imagine a particular situation to produce the desired emotion. Research has shown that different emotions affect brain waves differently, which has led to further research into computerized recognition of human emotions [1] [2] [3]. In this master's thesis, I analyzed neural (EEG) recordings made during emotional dance performance by two trained dancers, and the processed data was used to control the stage lighting color as the expressed emotion changed. Data from subject 1 and subject 2 was used to train a classifier offline; classification was performed with an artificial neural network. Four musical pieces (details in the method section) were selected by the dancers, each representing a particular emotion: "Anger", "Fear", "Neutral", and "Happy". These emotions were selected to cover the positive, negative, and neutral emotional range. ASM12 features [4] were extracted with a temporal resolution of one second using a 50% overlapping Hamming window, taking the sub-band frequency ranges delta (1-3 Hz), theta (4-7 Hz), alpha (8-12 Hz), and beta (14-30 Hz) for each symmetric electrode pair. During offline training and testing of a multilayer neural network with one hidden layer of 32 units, accuracies of 72.1% for subject 1 and 75.7% for subject 2 were obtained. The real-time accuracy was lower, and the system could, for the most part, classify only two of the emotional classes.
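
The following sketch illustrates the feature-extraction step described above, assuming ASM12-style features are computed as differential band power (left minus right) over symmetric electrode pairs. The sampling rate FS, the electrode pair list, and the use of Welch's method for spectral estimation are assumptions; the abstract specifies only the one-second windows, the 50% overlapping Hamming window, and the four sub-bands.

    import numpy as np
    from scipy.signal import welch

    FS = 256                            # assumed sampling rate; not stated in the abstract
    BANDS = {"delta": (1, 3), "theta": (4, 7), "alpha": (8, 12), "beta": (14, 30)}

    def band_power(x, fs, lo, hi):
        """Mean Hamming-windowed Welch PSD of one channel segment within [lo, hi] Hz."""
        freqs, psd = welch(x, fs=fs, window="hamming", nperseg=len(x))
        mask = (freqs >= lo) & (freqs <= hi)
        return psd[mask].mean()

    def asm_features(window, fs=FS, pairs=((0, 1),)):
        """Differential asymmetry features for one 1-second EEG window.

        window : (n_channels, n_samples) array
        pairs  : (left_idx, right_idx) symmetric electrode pairs; the pairs
                 used in the thesis are not listed in the abstract, so they
                 must be supplied by the caller.
        Returns len(pairs) * 4 values: left minus right band power per sub-band.
        """
        feats = []
        for left, right in pairs:
            for lo, hi in BANDS.values():
                feats.append(band_power(window[left], fs, lo, hi)
                             - band_power(window[right], fs, lo, hi))
        return np.array(feats)

    def sliding_windows(eeg, fs=FS, win_s=1.0, overlap=0.5):
        """Yield one-second windows with 50% overlap, as described in the abstract."""
        size = int(win_s * fs)
        step = int(size * (1 - overlap))
        for start in range(0, eeg.shape[1] - size + 1, step):
            yield eeg[:, start:start + size]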
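
Below is a minimal sketch of the offline classifier, matching the architecture reported in the abstract (a multilayer neural network with one hidden layer of 32 units and four emotion classes). The use of scikit-learn, the solver defaults, the feature standardization, and the synthetic placeholder data are assumptions for illustration, not the thesis implementation.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Placeholder data standing in for per-window ASM feature vectors;
    # real input would come from asm_features() applied to the EEG recordings.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 48))      # 48 = 12 electrode pairs x 4 sub-bands (assumed)
    y = rng.integers(0, 4, size=400)    # 0=Anger, 1=Fear, 2=Neutral, 3=Happy

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    # One hidden layer of 32 units, as described in the abstract;
    # solver and iteration settings are assumptions.
    clf = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
    )
    clf.fit(X_train, y_train)
    print("offline test accuracy:", clf.score(X_test, y_test))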

Keywords

EEG, Real Time, Emotion Recognition
