Humans can display expressions that contradict their actual emotional state, so it is difficult to judge a person's real emotions from physical appearance alone. Although researchers are working on facial expression analysis, voice recognition, and gesture recognition, the accuracy of such analyses is comparatively low and the results are not reliable. This work proposes classifying human emotions with machine learning models using discrete wavelet features extracted from the electroencephalogram (EEG). The EEG data are taken from the Database for Emotion Analysis using Physiological signals (DEAP), an online dataset that contains peripheral biological signals as well as EEG recordings. The EEG signals were collected from 32 subjects while they watched 40 one-minute-long music videos, and each participant rated every video clip on levels of valence, arousal, and dominance. In the proposed work, a significant EEG band with a reduced set of frontal electrodes (Fp1, F3, F4, Fp2) is considered to obtain comparably good results. The accuracies obtained from K-nearest neighbour (KNN), Fine KNN, and Support Vector Machine (SVM) classifiers are 92.5%, 90%, and 90%, respectively, for valence, arousal, and dominance.
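The pipeline the abstract describes — discrete wavelet features from a few EEG channels, followed by a nearest-neighbour classifier — can be sketched roughly as below. This is an illustrative stand-in only: the single-level Haar transform, the energy features, the synthetic signals, and the toy labels are assumptions for demonstration, not the DEAP data, the authors' wavelet choice, or their exact feature set.

```python
# Hedged sketch: DWT band-energy features + k-NN vote on toy "EEG" signals.
# Assumptions: Haar wavelet, detail-band energy as the feature, synthetic data.
import math
import random

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient lists."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))
        detail.append((a - b) / math.sqrt(2))
    return approx, detail

def band_energy(coeffs):
    """Energy of a coefficient vector, a common DWT-based feature."""
    return sum(c * c for c in coeffs)

def features(signal, levels=3):
    """Multi-level DWT: keep the detail-band energy at each level."""
    feats, current = [], signal
    for _ in range(levels):
        current, detail = haar_dwt(current)
        feats.append(band_energy(detail))
    return feats

def knn_predict(train_x, train_y, x, k=3):
    """Plain k-nearest-neighbour majority vote with Euclidean distance."""
    dists = sorted((math.dist(x, tx), ty) for tx, ty in zip(train_x, train_y))
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Toy demo: two classes of synthetic segments that differ in high-frequency
# content, standing in for e.g. high/low valence. Not real EEG.
random.seed(0)
def make_signal(noisy):
    return [math.sin(0.2 * t) + (random.gauss(0, 0.5) if noisy else 0.0)
            for t in range(64)]

train_x = [features(make_signal(noisy)) for noisy in [0, 0, 1, 1] * 5]
train_y = [0, 0, 1, 1] * 5
print(knn_predict(train_x, train_y, features(make_signal(False))))  # smooth class
print(knn_predict(train_x, train_y, features(make_signal(True))))   # noisy class
```

In practice one would use a standard wavelet library and scikit-learn's KNN/SVM implementations on the four frontal channels; the hand-rolled Haar transform here only keeps the sketch self-contained.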
Publication status: Published - 12-2022
All Science Journal Classification (ASJC) codes
- Electronic, Optical and Magnetic Materials
- Mechanics of Materials
- Industrial and Manufacturing Engineering
- Electrical and Electronic Engineering