A framework for artificial data generation based on anatomical differences for electroencephalography-based brain-computer interfaces
Covenantee: Technische Universität Berlin
Document type: Master thesis
Rights access: Open Access
One of the major limitations of brain-computer interfaces (BCIs) is the long and tedious calibration period required before a subject becomes proficient with the system. A principal challenge in training a BCI classifier that works without user-specific calibration is that the training set is not large enough to capture the spectrum of potential signals. In this thesis, a new method to reduce BCI calibration time is proposed. Since one cause of subject-to-subject variability is anatomical differences between subjects, we aim to generate artificial data that resemble the signals obtained from a new subject with a different cortical anatomy, allowing a large expansion of the training set. To generate the artificial data, we begin by decomposing the original signals, localizing the most prominent sources, and shifting their orientation relative to the cortex. New signals are then regenerated using different head models. Training a classifier on the enriched training set should result in better generalizability. Although inter-subject classification ultimately fell outside the scope of this thesis, we evaluate intra-subject classification as a starting point for assessing the proposed methods. This work lays the foundation for a broader line of research on artificial data generation to reduce BCI calibration time.
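The pipeline described in the abstract (decompose the recorded signals, estimate source activity, perturb the head model, regenerate artificial data) can be illustrated with a toy linear forward model. Everything here is an illustrative assumption, not the thesis's actual implementation: the dimensions, the random leadfield standing in for a head model, least-squares unmixing standing in for source decomposition and localization, and the additive perturbation standing in for an anatomically different subject.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: EEG channels, cortical sources, time samples
n_ch, n_src, n_t = 8, 3, 500

# Toy "head model": a leadfield matrix that mixes source activity
# into channel-space signals (X = A @ S in this linear sketch)
A = rng.standard_normal((n_ch, n_src))
S = rng.standard_normal((n_src, n_t))   # source time courses
X = A @ S                               # observed EEG-like signals

# Step 1: recover source activity from the recordings. A plain
# least-squares unmixing with the known leadfield stands in for the
# decomposition + source-localization stage of the real pipeline.
S_hat, *_ = np.linalg.lstsq(A, X, rcond=None)

# Step 2: perturb the head model. A slightly different leadfield
# plays the role of a new subject's cortical anatomy and shifted
# source orientations.
A_new = A + 0.1 * rng.standard_normal((n_ch, n_src))

# Step 3: regenerate artificial channel-space data under the new
# anatomy; these samples would augment the classifier's training set.
X_art = A_new @ S_hat

print(X_art.shape)  # same channel/time layout as the original data
```

In a real EEG setting, the decomposition would be something like ICA, the leadfields would come from subject-specific forward models, and the artificial trials would keep their original class labels so they can enlarge the training set directly.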