A novel paradigm for fast training data generation in asynchronous movement-based BCIs



Bibliographic Details
Main Authors: Markus R. Crell, Kyriaki Kostoglou, Kathrin Sterk, Gernot R. Müller-Putz
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-02-01
Series: Frontiers in Human Neuroscience
Online Access: https://www.frontiersin.org/articles/10.3389/fnhum.2025.1540155/full
Description
Summary:
Introduction: Movement-based brain-computer interfaces (BCIs) utilize brain activity generated during executed or attempted movement to provide control over applications. By relying on natural movement processes, these BCIs offer more intuitive control than other BCI systems. However, non-invasive movement-based BCIs that use electroencephalographic (EEG) signals usually require large amounts of training data to achieve suitable accuracy in detecting movement intent. Additionally, patients with movement impairments require cue-based paradigms to indicate the start of a movement-related task, and such paradigms tend to introduce long delays between trials, thereby extending training times. To address this, we propose a novel experimental paradigm that enables the collection of 300 cued movement trials in 18 min.
Methods: Using measurements from ten participants, we demonstrate that the data produced by this paradigm exhibit characteristics similar to those observed during self-paced movement.
Results and discussion: We also show that classifiers trained on these data can accurately detect executed movements, achieving an average true positive rate of 31.8% at a maximum of 1.0 false positives per minute.
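The reported performance combines two standard metrics for asynchronous BCI evaluation: the fraction of true movements detected (true positive rate) and the rate of spurious detections over time (false positives per minute); for scale, 300 trials in 18 min corresponds to an average of only 3.6 s per trial. The sketch below is a minimal Python illustration of how such onset-detection metrics might be computed; the function name, the ±1 s matching tolerance, and the toy numbers are assumptions chosen for illustration, not the authors' actual evaluation procedure.

import numpy as np

def evaluate_async_detections(true_onsets, detections, duration_s, tol_s=1.0):
    """Score asynchronous onset detections against ground-truth movement onsets.

    true_onsets : ground-truth movement onset times in seconds
    detections  : detector-reported onset times in seconds
    duration_s  : total recording length in seconds
    tol_s       : assumed tolerance; a detection within +/- tol_s of a
                  still-unmatched true onset counts as a hit
    """
    true_onsets = np.asarray(sorted(true_onsets), dtype=float)
    matched = np.zeros(len(true_onsets), dtype=bool)
    false_positives = 0

    for det in sorted(detections):
        # Find the nearest true onset that has not been matched yet.
        dists = np.abs(true_onsets - det)
        dists[matched] = np.inf
        idx = int(np.argmin(dists)) if len(dists) else -1
        if idx >= 0 and dists[idx] <= tol_s:
            matched[idx] = True          # hit: onset detected in time
        else:
            false_positives += 1         # spurious detection

    tpr = matched.mean() if len(true_onsets) else 0.0
    fp_per_min = false_positives / (duration_s / 60.0)
    return tpr, fp_per_min

# Toy usage: 5 true onsets over a 2-minute recording,
# with 3 detections inside the tolerance window and 1 spurious one.
tpr, fp_rate = evaluate_async_detections(
    true_onsets=[10, 35, 60, 85, 110],
    detections=[10.4, 35.8, 84.7, 100.0],
    duration_s=120,
)
print(f"TPR = {tpr:.1%}, FP/min = {fp_rate:.1f}")  # TPR = 60.0%, FP/min = 0.5

Under this scoring scheme, a detector can trade sensitivity for reliability: the 31.8% true positive rate reported in the abstract is measured at an operating point constrained to at most 1.0 false positives per minute.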
ISSN: 1662-5161