Title: A comparative analysis of the effectiveness of contrastive learning for P300 classification
Author: Conceição, David Miguel Baptista
Advisor: Fonseca, Manuel João Caneira Monteiro da
Keywords: Brain-Computer Interfaces
P300
Signal classification
Contrastive Learning
Neural Networks
Master's theses - 2025
Defense Date: 2025
Abstract: Brain-Computer Interfaces (BCIs) translate brain activity into precise commands for controlling external devices, providing crucial assistance to individuals with motor disabilities. Within this domain, P300 Speller systems serve as a vital communication tool. However, these systems face significant challenges, including the inherently noisy nature of EEG data and the high variability of signals both across and within individuals. This variability often forces these systems to rely on an extensive calibration phase, reducing the practicality of P300 spellers in real-world applications. Developing robust models that generalize effectively across users is therefore a key objective in this field. Contrastive learning approaches have recently gained attention for their ability to produce models capable of extracting meaningful features from data. While these techniques have shown great success in domains like computer vision and natural language processing, their application to the P300 paradigm remains underexplored. This study addresses this gap by conducting a comparative analysis of three pre-training strategies: SimCLR, SupCon, and supervised learning. These approaches were evaluated using three state-of-the-art neural network architectures: EEGNet, EEG-Inception, and Conformer. The pre-trained models were evaluated in both intra-dataset and cross-dataset scenarios. The results indicate that the impact of contrastive learning is highly dependent on the model architecture and on the evaluation setup. In the intra-dataset scenario, SimCLR exhibited the worst performance across all models, while SupCon and supervised pre-training produced comparable results, with SupCon achieving slightly higher performance as the amount of retraining data increased.
However, in cross-dataset scenarios, contrastive learning approaches, particularly SimCLR, demonstrated superior performance compared to supervised learning, showcasing their ability to generalize effectively across diverse data distributions. These findings underscore the potential of contrastive learning, and its robustness, to address the challenges of signal variability and reduce the reliance on extensive calibration in P300 speller systems by leveraging data from multiple sources.
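The two contrastive pre-training objectives compared in the abstract can be summarized in a minimal NumPy sketch of their standard formulations: SimCLR's NT-Xent loss treats only the other augmented view of each sample as a positive, while SupCon treats every sample sharing a label as a positive. This is an illustrative sketch of the generic losses, not code from the thesis; function names, the temperature values, and all details here are assumptions.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR NT-Xent loss for two batches of embeddings (augmented views).

    The positive for sample i in z1 is sample i in z2, and vice versa;
    all other samples in the concatenated batch act as negatives.
    """
    z = np.concatenate([z1, z2], axis=0)                 # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)     # L2-normalize
    sim = z @ z.T / temperature                          # scaled cosine similarities
    n = len(z1)
    sim[np.eye(2 * n, dtype=bool)] = -np.inf             # exclude self-similarity
    # index of the positive pair: i <-> i + n (mod 2N)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

def supcon_loss(z, labels, temperature=0.1):
    """Supervised contrastive (SupCon) loss.

    Positives for each anchor are all other samples with the same label,
    so each label must appear at least twice in the batch.
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    self_mask = np.eye(len(z), dtype=bool)
    sim[self_mask] = -np.inf
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    # average log-probability over each anchor's positives
    per_anchor = np.where(pos_mask, log_prob, 0.0).sum(axis=1) / pos_mask.sum(axis=1)
    return -per_anchor.mean()
```

In the P300 setting described above, `z` would be the encoder output (e.g. from EEGNet) for a batch of EEG epochs, and the SupCon `labels` would be the binary target/non-target annotations available during supervised pre-training.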
Description: Master's Thesis, Informatics, 2025, Universidade de Lisboa, Faculdade de Ciências
URI: http://hdl.handle.net/10400.5/99759
Designation: Mestrado em Informática
Appears in Collections: FC-DI - Master Thesis (dissertation)

Files in This Item:
File: TM_David_Conceição.pdf
Size: 3,78 MB
Format: Adobe PDF


