connect2brain/bci-tms

BCI-TMS control system

This repository contains the core logic for an integrated brain-computer interface (BCI) and transcranial magnetic stimulation (TMS) system. The code is designed to run motor imagery (MI) and motor execution (ME) experiments with robot-guided TMS.

References

Coming soon!

Key contributors

Dr. Renan H. Matsuda, Matilda Makkonen, Dr. Ivan Zubarev, Dr. Olli-Pekka Kahilakoski, Leevi A. Kinnunen, Mila Nurminen, Dr. Victor H. Souza, Adj. Prof. Pantelis Lioumis

Department of Neuroscience and Biomedical Engineering, Aalto University, Espoo, Finland


License

This project is licensed under the GPL v3 License - see the LICENSE file for details.


Overview

The system operates within the NeuroSimo environment (designed to work with version v0.1.0), coordinating real-time EEG processing, visual stimuli, and robotically guided TMS to provide brain-state-dependent stimulation.

Key features:

  • Real-time EEG processing and classification: Preprocessing and decoding of motor execution/imagery states from EEG using mneflow and sklearn.
  • Robot-guided TMS: Automated movement of the TMS coil to left/right primary motor cortex (M1) brain targets using tms-robot-control and invesalius for precise navigation.
  • mTMS integration: Designed to work with the mtms project for multi-locus TMS stimulation.
  • State-synchronized visuals: Precise visual cues using PsychoPy.
  • Flexible protocol management: Modular state factory to design and sequence experimental blocks.

Core components

1. MIME-BCI-presenter.py

Handles the visual presentation throughout the experiment.

  • Visuals: Displays fixation crosses, arrows, and instructional text.
  • States: Implements process_* methods for every experimental stage (Welcome, Trial, Result, ITI, etc.).
  • Feedback: Includes a dedicated process where classification results can be displayed in a neurofeedback fashion.
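The `process_*` dispatch described above can be sketched as follows. The state names and handler bodies here are illustrative, not the repository's actual implementation (which renders stimuli with PsychoPy rather than returning strings):

```python
from enum import Enum, auto

# Illustrative state set; the real Enum classes live in the decider/presenter files.
class States(Enum):
    WELCOME = auto()
    TRIAL = auto()
    RESULT = auto()
    ITI = auto()

class Presenter:
    """Routes each experimental state to a matching process_* handler."""

    def handle(self, state: States) -> str:
        # Look up process_welcome, process_trial, ... by state name.
        handler = getattr(self, f"process_{state.name.lower()}")
        return handler()

    def process_welcome(self) -> str:
        return "Welcome to the experiment"

    def process_trial(self) -> str:
        return "Show fixation cross and cue arrow"

    def process_result(self) -> str:
        return "Display classification feedback"

    def process_iti(self) -> str:
        return "Blank screen during inter-trial interval"
```

With this pattern, adding a new experimental stage only requires a new Enum member and a matching `process_*` method.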

2. MIME-BCI-decider_training.py

Collects the training data for the classifier.

  • Data buffering: Collects EEG/EMG epochs during motor imagery/execution trials.
  • Protocol: Runs a sequence of blocks with breaks in between to collect the training dataset.
  • Triggers: Logs all trial events and state transitions to a CSV log file for offline analysis.
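A minimal sketch of the epoch buffering and CSV event logging described above, assuming fixed-length epochs; the class, function, and column names are hypothetical, not the repository's API:

```python
import csv
import time
from collections import deque

class EpochBuffer:
    """Accumulate streamed samples into fixed-length epochs (hypothetical sketch)."""

    def __init__(self, n_samples: int):
        self.n_samples = n_samples
        self._samples = deque()
        self.epochs = []  # completed epochs, ready for classifier training

    def push(self, sample) -> None:
        """Add one sample; when a full epoch is collected, store and reset."""
        self._samples.append(sample)
        if len(self._samples) == self.n_samples:
            self.epochs.append(list(self._samples))
            self._samples.clear()

def log_event(writer, trial: int, event: str) -> None:
    """Append one timestamped trial event (e.g. a state transition) to the CSV log."""
    writer.writerow([time.time(), trial, event])
```

For example, with `n_samples=500` an epoch is finalized after every 500 `push` calls, and each state transition becomes one timestamped CSV row for offline analysis.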

3. MIME-BCI-decider.py

The main controller of real-time operation with EEG classification and TMS control.

  • Real-time classifier: Loads a pre-trained classifier to decode ME/MI states from EEG trials.
  • TMS logic: Implements the _send_pulse method to stimulate specific mTMS targets (left/right M1).
  • Robot control: Coordinates with tms-robot-control and invesalius to command the robot to move between 'Home' and stimulation targets based on classification results.

Related projects

  • NeuroSimo – the real-time EEG processing environment the system runs in
  • mtms – multi-locus TMS stimulation
  • tms-robot-control – robotic positioning of the TMS coil
  • invesalius – neuronavigation

Configuration & requirements

mTMS target definition

In MIME-BCI-decider.py, the user must define the following parameters for the specific brain targets:

  • displacement_x & displacement_y
  • rotation_angle
  • intensity
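For example, the per-target parameters could be collected into a single mapping. All numbers and units below are placeholders, not calibrated values; real displacements and rotations come from target planning for each participant:

```python
# Placeholder target definitions for MIME-BCI-decider.py.
# Values and units are illustrative only (assumed: mm, degrees).
MTMS_TARGETS = {
    "left_M1": {
        "displacement_x": -12.0,
        "displacement_y": 5.0,
        "rotation_angle": 45.0,
        "intensity": 40,
    },
    "right_M1": {
        "displacement_x": 12.0,
        "displacement_y": 5.0,
        "rotation_angle": -45.0,
        "intensity": 40,
    },
}
```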

State synchronization

Important: The States, TrialIntros, Trials, and TrialResults Enum classes must be identical in both the Decider and the Presenter files. Any mismatch will cause communication failures within the system.
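One way to catch such a mismatch early is to compare the Enum classes member-for-member at startup. The members below are illustrative; the real definitions are in the decider and presenter files:

```python
from enum import Enum, auto

class States(Enum):
    """Must be defined identically in the Decider and the Presenter files."""
    WELCOME = auto()
    TRIAL = auto()
    RESULT = auto()
    ITI = auto()

def enums_match(a, b) -> bool:
    """True if two Enum classes agree member-for-member in name and value."""
    return [(m.name, m.value) for m in a] == [(m.name, m.value) for m in b]
```

Checking `enums_match` once on connection (e.g. by exchanging the member lists) turns a silent protocol mismatch into an immediate, explicit error.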


Workflow

  1. Training data collection: Run MIME-BCI-decider_training.py in NeuroSimo while the participant performs movements or movement imagery to collect classifier training data.
  2. Model training: Train your model by running train_model.py.
  3. Real-time BCI-TMS operation:
    • Ensure your trained model path is correctly set in MIME-BCI-decider.py.
    • Run the system in NeuroSimo to administer real-time robot-guided TMS based on the participant's brain activity.
