
Human emotion recognition based on electroencephalogram analysis

DELL'ORLETTA, ALESSANDRO
2023/2024

Abstract

The brain regulates bodily functions, including emotions, memory, and movement, through neurons and glial cells. Emotions, rooted in the brain's activity, influence decision-making, social interactions, and responses to stress. Recognizing emotions improves communication and behaviour. Electroencephalography (EEG) is a key tool for studying brain activity, offering insights into mental states such as attention and emotional involvement. The aim of this study is to investigate whether it is possible to use EEG-derived engagement indexes to build models for human emotion classification, exploiting machine learning techniques. To do so, we analysed the EEG of a population watching evocative short videos without audio. EEG data were acquired from 145 subjects and are part of a freely available database on Zenodo, named Emotion Arousal Pattern (EMAP), specifically designed to support research on emotional states induced by videos selected to elicit a wide range of emotions. The evoked emotions, assessed on a continuous scale from 0 (not at all) to 10 (very strongly), are classified as "Anger", "Happy", "Sadness", "Disgust" and "Fear". EEG recordings were preprocessed using the EEGLAB software. To ensure clean data, sliding windows were extracted and the noisy ones were removed. Preprocessing involved 0.5-70 Hz band-pass filtering, average-based re-referencing, and artifact removal after independent component analysis using the ICLabel plug-in. EEG rhythms were then extracted and their powers computed, and 37 involvement ratio indexes were calculated according to the review of Marcantoni et al. To visualize differences between emotions, we plotted topographic maps of each rhythm's power and each engagement index's score.
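The rhythm-power and engagement-index step described above can be illustrated with a minimal numpy sketch. This is not the study's actual pipeline: the sampling rate (256 Hz), the synthetic single-channel signal, and the band limits are assumptions for illustration, and only the classic beta/(theta + alpha) engagement ratio is shown, where the study computes 37 such indexes following Marcantoni et al.

```python
import numpy as np

fs = 256  # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
t = np.arange(fs * 10) / fs  # 10 s of synthetic single-channel "EEG"

# Dominant alpha (10 Hz) and weaker beta (20 Hz) oscillations plus noise
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + 1.0 * np.sin(2 * np.pi * 20 * t)
eeg += 0.5 * rng.standard_normal(eeg.size)

# Periodogram via FFT (a simple stand-in for a Welch PSD estimate)
freqs = np.fft.rfftfreq(eeg.size, d=1 / fs)
psd = np.abs(np.fft.rfft(eeg)) ** 2 / eeg.size

def band_power(lo, hi):
    """Total spectral power in the [lo, hi) Hz band."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()

theta = band_power(4, 8)
alpha = band_power(8, 13)
beta = band_power(13, 30)

# Classic engagement (involvement ratio) index: beta / (theta + alpha)
engagement = beta / (theta + alpha)
print(round(engagement, 3))
```

In a real pipeline this computation would be repeated per channel and per sliding window, yielding one feature map per rhythm and per index, which is what the topographic plots visualize.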
Then, for each of the five emotions considered, we developed reliable classification models using LASSO regression, as shown by the models' R² coefficients ("Anger" 0.7873, "Happy" 0.8106, "Sadness" 0.7859, "Disgust" 0.7908 and "Fear" 0.8045). The analysis showed that nearly all EEG channels contributed to emotion discrimination, while a few features, among rhythm powers and engagement indexes, were less informative (I11 in 75.62%, I12 in 68.44%, I15 in 63.12%, and I21 in 75% of channels). This finding highlights the importance of comprehensive coverage of brain activity and the need to incorporate all brain rhythms and engagement indexes in emotion classification tasks. The study also underscores the complexity of using EEG for emotion recognition, showing that a holistic approach, one that accounts for a wide array of features and channels, is crucial for developing accurate models. The significant role of machine learning techniques such as LASSO regression further emphasizes their potential for handling high-dimensional data and selecting the most informative features, allowing for more reliable and interpretable emotion classification models.
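The role LASSO plays here — fitting a linear model while driving uninformative coefficients exactly to zero — can be sketched with a small numpy-only implementation (proximal gradient / ISTA). Everything below is synthetic and hypothetical: the data, the regularization strength `alpha`, and the sparse ground truth are illustrative assumptions, not the study's actual features or settings.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 norm: shrinks z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, alpha=0.05, n_iter=2000):
    """Minimize (1/2n)||Xw - y||^2 + alpha*||w||_1 by proximal gradient descent."""
    n, p = X.shape
    step = n / (np.linalg.norm(X, 2) ** 2)  # 1 / Lipschitz constant of the gradient
    w = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n  # gradient of the least-squares term
        w = soft_threshold(w - step * grad, step * alpha)
    return w

rng = np.random.default_rng(42)
n, p = 200, 40  # samples x features (think: rhythm powers + engagement indexes)
X = rng.standard_normal((n, p))
true_w = np.zeros(p)
true_w[:5] = [2.0, -1.5, 1.0, 0.8, -0.5]  # only a few features truly matter
y = X @ true_w + 0.1 * rng.standard_normal(n)  # continuous emotion-rating-like target

w = lasso_ista(X, y)
r2 = 1 - np.sum((y - X @ w) ** 2) / np.sum((y - y.mean()) ** 2)
n_selected = int(np.count_nonzero(w))
```

The key property illustrated is that `n_selected` ends up well below the full feature count: the L1 penalty removes features that do not help predict the rating, which mirrors how indexes such as I11 or I21 were dropped from most channel-wise models.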
2025-02-17
Files in this item:

File: Tesi LM.pdf (open access)
Size: 12.27 MB
Format: Adobe PDF

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12075/20934