Development of a measurement procedure for emotion recognition using single-channel electrode EEG

FEYISA, FASIKA TESHOME
2024/2025

Abstract

Real-time emotion recognition using electroencephalography (EEG) is a growing area of interest with promising applications in mental health monitoring, human–computer interaction, and adaptive environments. While conventional multi-channel scalp EEG systems offer high accuracy, their complexity and intrusiveness limit their utility in real-world settings. This study explores the feasibility of emotion recognition using a single-channel, wearable ear-EEG system, specifically an electrode placed over the right mastoid region (TP10). The aim is to assess whether such a minimalist setup can yield reliable emotional state classification based on EEG features. Twenty-four participants took part in an emotion elicitation experiment involving a set of pre-validated video clips, each designed to evoke a specific emotional state: happiness, sadness, tension, relaxation, or neutral. EEG signals were recorded with the OpenBCI Cyton board at a sampling frequency of 250 Hz and the Ganglion board at 200 Hz, and self-reported ratings of valence and arousal were collected after each video using the Self-Assessment Manikin (SAM). Preprocessing included bandpass filtering (1–45 Hz), notch filtering at 50 Hz or 60 Hz, and artifact rejection based on a ±60 µV threshold. To ensure signal quality, subject selection was based on the presence of a clear alpha peak and the Berger effect, i.e., increased alpha power in the eyes-closed (EC) compared to the eyes-open (EO) condition. Ultimately, data from 19 participants were retained for analysis. Power spectral density (PSD) was computed using the Welch method on 15-second segments of EEG data with a 2-second shift. From these segments, absolute and relative band power (Delta, Theta, Alpha, Beta, Gamma), approximate entropy (ApEn), and two spectral asymmetry ratios, the Temporal Alpha-to-Beta Ratio (TABR) and the Temporal Alpha-to-Gamma Ratio (TAGR), were extracted. Statistical tests, including analysis of variance (ANOVA), the Wilcoxon signed-rank test, and the Mann–Whitney U test, were applied depending on normality assumptions. Significance was determined at p < 0.05, and Cohen's d was calculated to assess effect sizes, along with 95% confidence intervals (CI) to evaluate the robustness of observed differences. Results demonstrated that features such as relative and absolute Gamma and Beta power, ApEn, and the TAGR and TABR ratios consistently differentiated between emotional states, especially in the low-valence versus high-valence (LV vs. HV) and Neutral versus Tense comparisons. Subject-level analyses showed inter-individual variability in EEG responsiveness, while group-level comparisons revealed robust patterns in discrete emotion comparisons (such as Sad vs. Happy, Sad vs. Tense, and Neutral vs. Sad), especially in the relative Alpha, Beta, and Gamma bands, TAGR, TABR, and ApEn. These findings demonstrate that meaningful emotional signals can be detected even with a single-channel ear-EEG, reinforcing the potential of wearable ear-EEG systems for emotion monitoring in everyday settings.
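As a rough illustration of the pipeline summarised in the abstract, the Python sketch below implements single-channel preprocessing and spectral feature extraction with SciPy. It is not the thesis code: the function names, filter order, Welch sub-window length, band edges, the 1–45 Hz range used as the denominator for relative power, and the synthetic input signal are all assumptions, and the ApEn feature is omitted for brevity.

import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import butter, filtfilt, iirnotch, welch

FS = 250  # Hz (OpenBCI Cyton; Ganglion recordings would use 200 Hz instead)

# Band edges in Hz (assumption: common defaults consistent with the 1-45 Hz bandpass)
BANDS = {
    "Delta": (1, 4),
    "Theta": (4, 8),
    "Alpha": (8, 13),
    "Beta": (13, 30),
    "Gamma": (30, 45),
}

def preprocess(eeg, fs=FS, notch_hz=50.0):
    """Bandpass 1-45 Hz and notch at the mains frequency (50 or 60 Hz)."""
    b, a = butter(4, [1.0, 45.0], btype="bandpass", fs=fs)
    eeg = filtfilt(b, a, eeg)
    bn, an = iirnotch(notch_hz, Q=30.0, fs=fs)
    return filtfilt(bn, an, eeg)

def segments(eeg, fs=FS, win_s=15, shift_s=2, amp_uv=60.0):
    """Yield 15-s windows with a 2-s shift, rejecting windows exceeding +/-60 uV."""
    win, shift = int(win_s * fs), int(shift_s * fs)
    for start in range(0, len(eeg) - win + 1, shift):
        seg = eeg[start:start + win]
        if np.max(np.abs(seg)) <= amp_uv:
            yield seg

def band_features(seg, fs=FS):
    """Welch PSD, absolute/relative band power, and the TABR/TAGR ratios."""
    freqs, psd = welch(seg, fs=fs, nperseg=2 * fs)  # 2-s Welch sub-windows (assumption)
    broadband = (freqs >= 1) & (freqs <= 45)
    total = trapezoid(psd[broadband], freqs[broadband])
    feats = {}
    for name, (lo, hi) in BANDS.items():
        band = (freqs >= lo) & (freqs <= hi)
        power = trapezoid(psd[band], freqs[band])
        feats[f"Abs_{name}"] = power
        feats[f"Rel_{name}"] = power / total
    feats["TABR"] = feats["Abs_Alpha"] / feats["Abs_Beta"]
    feats["TAGR"] = feats["Abs_Alpha"] / feats["Abs_Gamma"]
    return feats

# Example on synthetic data standing in for a TP10 recording (values in microvolts).
eeg_uv = 10.0 * np.random.randn(60 * FS)
clean = preprocess(eeg_uv)
features = [band_features(seg) for seg in segments(clean)]

The statistical step can be sketched in the same hedged spirit: for a given feature and a pair of conditions, a normality check selects between a parametric and a non-parametric paired test, and Cohen's d plus a 95% CI quantify the effect. The bootstrap CI and the paired-samples form of Cohen's d are assumptions; the thesis may compute these differently, and for unpaired group comparisons scipy.stats.mannwhitneyu would replace the Wilcoxon test.

import numpy as np
from scipy import stats

def compare_conditions(x, y, alpha=0.05, n_boot=5000, seed=0):
    """Paired comparison of one EEG feature across two emotional conditions."""
    rng = np.random.default_rng(seed)
    diff = np.asarray(x) - np.asarray(y)

    # Shapiro-Wilk on the paired differences decides parametric vs non-parametric.
    normal = stats.shapiro(diff).pvalue > alpha
    if normal:
        stat, p = stats.ttest_rel(x, y)   # paired t-test
    else:
        stat, p = stats.wilcoxon(x, y)    # Wilcoxon signed-rank test

    # Cohen's d for paired samples: mean difference over SD of the differences
    # (one common formulation; a pooled-SD variant is also possible).
    d = diff.mean() / diff.std(ddof=1)

    # Percentile-bootstrap 95% CI of the mean difference (assumption).
    boots = [rng.choice(diff, size=len(diff), replace=True).mean()
             for _ in range(n_boot)]
    ci = np.percentile(boots, [2.5, 97.5])
    return {"p": p, "cohens_d": d, "ci95": tuple(ci), "normal": normal}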
Files in this item:
thesis+fasikaf.pdf
Not available
Description: it is my thesis document
Size: 4.24 MB
Format: Adobe PDF
Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12075/21985