Graduation Year

2020

Document Type

Thesis

Degree

M.S.C.S.

Degree Name

MS in Computer Science (M.S.C.S.)

Degree Granting Department

Computer Science and Engineering

Major Professor

Marvin Andujar, Ph.D.

Committee Member

Shaun Canavan, Ph.D.

Committee Member

Paul A. Rosen, Ph.D.

Keywords

Affective Computing, DEAP, Deep Neural Networks, EEG

Abstract

Recognizing emotions is important when building robust, interactive Affective Brain-Computer Interfaces, as it gives machines a degree of emotional intelligence with which they can follow the changing emotional state of users. In the past, emotions have been recognized from unimodal data such as electroencephalography (EEG) signals, speech, facial expressions, or peripheral physiological signals. However, emotions are complex, arising from a combination of human behavior, thinking, and feeling; multi-modal techniques therefore recognize emotions more reliably than unimodal methods. This thesis aims to recognize and classify human emotions into high/low arousal and high/low valence using a multi-modal approach. The modalities used are EEG, blood pressure, respiration, skin temperature, eye movements, muscle movements, and skin conductance. The data are taken from the publicly available DEAP dataset. The experiments are performed using a 1D Convolutional LSTM network, and its performance is compared with three baseline Machine Learning (ML) algorithms: Support Vector Machine, K-Nearest Neighbor, and Random Forest. To investigate further, emotion classification performance is also compared across different brain regions: the frontal, parietal, temporal, and occipital lobes, and the left and right hemispheres. The model achieved an average accuracy of 91.19% for valence and 91.51% for arousal when used with a combination of EEG and peripheral data. Overall, the results show that the proposed neural network outperforms the traditional ML algorithms and achieves high emotion classification accuracy.
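
For illustration only, the sketch below shows one way a 1D convolutional LSTM binary classifier of the kind described in the abstract could be set up in Keras. The input shape follows the DEAP preprocessed format (40 channels, i.e. 32 EEG plus 8 peripheral, sampled at 128 Hz over 63-second trials); the layer sizes, dropout rate, and training settings are assumptions for the sketch and are not taken from the thesis.

```python
# Minimal sketch (not the thesis code): a 1D Conv + LSTM binary classifier
# for high/low valence or arousal on DEAP-style input of shape
# (time_steps, channels) = (8064, 40). Layer sizes are illustrative only.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_conv_lstm(time_steps=8064, channels=40):
    model = models.Sequential([
        layers.Input(shape=(time_steps, channels)),
        # 1D convolutions learn local temporal features across all channels
        layers.Conv1D(64, kernel_size=3, activation='relu'),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(128, kernel_size=3, activation='relu'),
        layers.MaxPooling1D(pool_size=2),
        # LSTM captures longer-range temporal dependencies in the feature maps
        layers.LSTM(64),
        layers.Dropout(0.5),
        # Single sigmoid unit for the binary high/low label
        layers.Dense(1, activation='sigmoid'),
    ])
    model.compile(optimizer='adam',
                  loss='binary_crossentropy',
                  metrics=['accuracy'])
    return model

# Example usage with arrays X_train (n, 8064, 40) and y_train (n,) of 0/1 labels:
# model = build_conv_lstm()
# model.fit(X_train, y_train, epochs=20, batch_size=32, validation_split=0.1)
```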
