Real Time Classification of Emotions to Control Stage Lighting During Dance Performance

dc.contributor.advisor: Contreras-Vidal, Jose L.
dc.contributor.committeeMember: Omurtag, Ahmet
dc.contributor.committeeMember: Prasad, Saurabh
dc.creator: Lnu, Shruti Ray
dc.creator.orcid: 0000-0001-5837-9605
dc.date.accessioned: 2018-12-03T21:01:50Z
dc.date.available: 2018-12-03T21:01:50Z
dc.date.created: August 2016
dc.date.issued: 2016-08
dc.date.submitted: August 2016
dc.date.updated: 2018-12-03T21:01:51Z
dc.description.abstract: Recently, there has been growing research in the field of electroencephalography (EEG)-based recognition of emotions, known as affective computing, in which subjects are either shown pictures to elicit the necessary emotional response or asked to imagine a particular situation to produce the desired emotion. Research has shown that different emotions affect brain waves differently, prompting further research into computerized recognition of human emotions [1] [2] [3]. In this master's thesis, I analyzed neural (EEG) recordings from two trained dancers during emotional dance performances. The processed data was used to control the stage lighting color as the performer's emotion changed. Data from subject 1 and subject 2 were used to train a classifier offline, and classification was performed with an artificial neural network. Four musical pieces (detailed in the Methods section) were selected by the dancers, each representing a particular emotion: "Anger", "Fear", "Neutral", and "Happy". These emotions were selected to cover the range of positive, negative, and neutral emotions. Features of type ASM12 [4] were extracted at a temporal resolution of one second using 50%-overlapping Hamming windows. The sub-band frequency ranges delta (1-3 Hz), theta (4-7 Hz), alpha (8-12 Hz), and beta (14-30 Hz) were used for each symmetric electrode pair. Offline training and testing of a multilayer neural network with one hidden layer of 32 units yielded accuracies of 72.1% for subject 1 and 75.7% for subject 2. Real-time accuracy was lower, and the system could reliably classify only two emotional classes.
dc.description.department: Biomedical Engineering, Department of
dc.format.digitalOrigin: born digital
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/10657/3613
dc.language.iso: eng
dc.rights: The author of this work is the copyright owner. UH Libraries and the Texas Digital Library have their permission to store and provide access to this work. Further transmission, reproduction, or presentation of this work is prohibited except with permission of the author(s).
dc.subject: EEG
dc.subject: Real Time
dc.subject: Emotion Recognition
dc.title: Real Time Classification of Emotions to Control Stage Lighting During Dance Performance
dc.type.dcmi: Text
dc.type.genre: Thesis
thesis.degree.college: Cullen College of Engineering
thesis.degree.department: Biomedical Engineering, Department of
thesis.degree.discipline: Biomedical Engineering
thesis.degree.grantor: University of Houston
thesis.degree.level: Masters
thesis.degree.name: Master of Science
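The feature pipeline described in the abstract (per-band EEG power over one-second, 50%-overlapping Hamming windows for a symmetric electrode pair) can be sketched roughly as below. This is a minimal illustration, not the thesis code: the sampling rate, the function names, and the use of a simple left-minus-right power difference as the asymmetry measure are all assumptions; the actual ASM12 feature definition [4] may differ.

```python
import numpy as np

FS = 250  # assumed sampling rate in Hz; not stated in the abstract
# Sub-band ranges as given in the abstract
BANDS = {"delta": (1, 3), "theta": (4, 7), "alpha": (8, 12), "beta": (14, 30)}

def band_powers(window, fs=FS):
    """Spectral power in each EEG sub-band for one single-channel window."""
    w = window * np.hamming(len(window))           # Hamming taper
    psd = np.abs(np.fft.rfft(w)) ** 2              # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(w), d=1.0 / fs)
    return np.array([psd[(freqs >= lo) & (freqs <= hi)].sum()
                     for lo, hi in BANDS.values()])

def asymmetry_features(left, right, fs=FS):
    """Per-band asymmetry (here: left minus right power) for one symmetric
    electrode pair, over one-second windows with 50% overlap."""
    win, step = fs, fs // 2                        # 1 s windows, 50% overlap
    feats = []
    for start in range(0, len(left) - win + 1, step):
        pl = band_powers(left[start:start + win], fs)
        pr = band_powers(right[start:start + win], fs)
        feats.append(pl - pr)
    return np.array(feats)                         # shape: (n_windows, 4)
```

Stacking these per-pair features over all symmetric electrode pairs would give the input vectors that the abstract's one-hidden-layer, 32-unit neural network is trained on.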

Files

Original bundle

Name: LNU-THESIS-2016.pdf
Size: 3 MB
Format: Adobe Portable Document Format

License bundle

Name: LICENSE.txt
Size: 1.81 KB
Format: Plain Text