TY - GEN
T1 - Movie Identification from Electroencephalography Response Using Convolutional Neural Network
AU - Sonawane, Dhananjay
AU - Pandey, Pankaj
AU - Mukopadhyay, Dyutiman
AU - Miyapuram, Krishna Prasad
N1 - Publisher Copyright:
© 2021, Springer Nature Switzerland AG.
PY - 2021
Y1 - 2021
N2 - Visual, audio, and emotional perception by human beings has been an interesting research topic over the past few decades. Electroencephalography (EEG) signals are one way to represent human brain activity. It has been shown that distinct brain networks correspond to the processing of different emotional stimuli. In this paper, we demonstrate a deep learning architecture for a movie identification task from EEG responses using a Convolutional Neural Network (CNN). The dataset includes nine movie clips that span different emotional states. EEG time-series data were collected from 20 participants. Given a one-second EEG response from a particular participant, we predict its corresponding movie ID. We also discuss the pre-processing steps for data cleaning and the data augmentation process. All participants are represented in both the training and test data. We obtained 80.22% test accuracy on this movie classification task. We also attempted cross-participant testing with the same model, and performance was poor for unseen participants. Our results give insight into the creation of identifiable patterns in the brain during audiovisual perception.
AB - Visual, audio, and emotional perception by human beings has been an interesting research topic over the past few decades. Electroencephalography (EEG) signals are one way to represent human brain activity. It has been shown that distinct brain networks correspond to the processing of different emotional stimuli. In this paper, we demonstrate a deep learning architecture for a movie identification task from EEG responses using a Convolutional Neural Network (CNN). The dataset includes nine movie clips that span different emotional states. EEG time-series data were collected from 20 participants. Given a one-second EEG response from a particular participant, we predict its corresponding movie ID. We also discuss the pre-processing steps for data cleaning and the data augmentation process. All participants are represented in both the training and test data. We obtained 80.22% test accuracy on this movie classification task. We also attempted cross-participant testing with the same model, and performance was poor for unseen participants. Our results give insight into the creation of identifiable patterns in the brain during audiovisual perception.
UR - https://www.scopus.com/pages/publications/85115870285
U2 - 10.1007/978-3-030-86993-9_25
DO - 10.1007/978-3-030-86993-9_25
M3 - Conference contribution
AN - SCOPUS:85115870285
SN - 9783030869922
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 267
EP - 276
BT - Brain Informatics - 14th International Conference, BI 2021, Proceedings
A2 - Mahmud, Mufti
A2 - Kaiser, M Shamim
A2 - Vassanelli, Stefano
A2 - Dai, Qionghai
A2 - Zhong, Ning
PB - Springer Science and Business Media Deutschland GmbH
T2 - 14th International Conference on Brain Informatics, BI 2021
Y2 - 17 September 2021 through 19 September 2021
ER -