This repository contains the core logic for an integrated brain-computer interface (BCI) and transcranial magnetic stimulation (TMS) system, designed to conduct motor imagery (MI) and motor execution (ME) experiments with robot-guided TMS.
Coming soon!
Dr. Renan H. Matsuda, Matilda Makkonen, Dr. Ivan Zubarev, Dr. Olli-Pekka Kahilakoski, Leevi A. Kinnunen, Mila Nurminen, Dr. Victor H. Souza, Adj. Prof. Pantelis Lioumis
Department of Neuroscience and Biomedical Engineering, Aalto University, Espoo, Finland
This project is licensed under the GPL v3 License - see the LICENSE file for details.
The system operates within the NeuroSimo environment (designed for version v0.1.0), coordinating real-time EEG processing, visual stimuli, and robot-guided TMS to provide brain-state-dependent stimulation.
- Real-time EEG processing and classification: Preprocessing and decoding of motor execution/imagery states from EEG using `mneflow` and `sklearn`.
- Robot-guided TMS: Automated movement of the TMS coil to left/right primary motor cortex (M1) brain targets using `tms-robot-control` and `invesalius` for precise navigation.
- mTMS integration: Designed to work with the mtms project for multi-locus TMS stimulation.
- State-synchronized visuals: Precise visual cues using PsychoPy.
- Flexible protocol management: Modular state factory to design and sequence experimental blocks.
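The modular state factory can be sketched roughly as follows. This is an illustrative example only: the state names and the `make_block` helper are hypothetical, while the project's real Enum classes live in the Decider and Presenter files.

```python
from enum import Enum, auto

class States(Enum):
    """Illustrative experimental states; the actual Enum is defined in the project."""
    WELCOME = auto()
    TRIAL = auto()
    RESULT = auto()
    ITI = auto()

def make_block(n_trials):
    """Build one experimental block as a flat sequence of states."""
    sequence = [States.WELCOME]
    for _ in range(n_trials):
        sequence += [States.TRIAL, States.RESULT, States.ITI]
    return sequence

# Sequence two blocks back to back; a break state could be inserted between them.
protocol = make_block(2) + make_block(2)
```

Building the protocol as plain data (a list of Enum members) keeps block design and block execution decoupled.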
Handles the visual presentation throughout the experiment.
- Visuals: Displays fixation crosses, arrows, and instructional text.
- States: Implements `process_*` methods for every experimental stage (Welcome, Trial, Result, ITI, etc.).
- Feedback: Includes a dedicated process where classification results can be displayed in a neurofeedback fashion.
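A minimal sketch of how a `process_*` dispatch pattern typically works; the method names and return values below are illustrative, not the project's actual API:

```python
class Presenter:
    """Sketch of per-state dispatch via process_* methods (names are hypothetical)."""

    def process_welcome(self):
        return "Welcome to the experiment"

    def process_iti(self):
        return "+"  # fixation cross shown between trials

    def handle(self, state_name):
        # Look up the process_* method that matches the current state name.
        method = getattr(self, f"process_{state_name}")
        return method()

presenter = Presenter()
presenter.handle("iti")  # dispatches to process_iti
```

The `getattr` lookup means adding a new experimental stage only requires defining one more `process_*` method.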
Used to collect data for classifier training.
- Data buffering: Collects EEG/EMG epochs during motor imagery/execution trials.
- Protocol: Runs a sequence of blocks with breaks in between to collect the training dataset.
- Triggers: Logs all trial events and state transitions to a CSV log file for offline analysis.
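Event logging of this kind can be sketched as below; the column layout is an assumption for illustration, not the project's actual log format:

```python
import csv
import time

def log_event(path, trial, state):
    """Append one timestamped trial event / state transition to a CSV log.

    Columns (hypothetical): unix timestamp, trial number, state name.
    """
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([time.time(), trial, state])
```

Appending one row per transition keeps the log usable for offline reconstruction of the trial timeline.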
The main controller for real-time operation, combining EEG classification and TMS control.
- Real-time classifier: Loads a pre-trained classifier to decode ME/MI states from EEG trials.
- TMS logic: Implements the `_send_pulse` method to stimulate specific mTMS targets (left/right M1).
- Robot control: Coordinates with `tms-robot-control` and `invesalius` to command the robot to move between 'Home' and stimulation targets based on classification results.
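The decision-to-target step can be sketched as a simple mapping; the class labels and the contralateral left/right assignment here are assumptions for illustration, and the actual mapping depends on the experiment configuration:

```python
def decide_action(predicted_class):
    """Map a classifier decision to a robot/stimulation target (hypothetical).

    Unknown or rest-like decisions send the coil back to 'Home'.
    """
    mapping = {
        "left_hand": "right_M1",   # assumed contralateral stimulation
        "right_hand": "left_M1",
        "rest": "Home",
    }
    return mapping.get(predicted_class, "Home")
```

Keeping this mapping in one place makes it easy to audit which brain state triggers which target before a pulse is sent.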
- NeuroSimo: https://github.com/NeuroSimo/neurosimo
- mtms: https://github.com/connect2brain/mtms
- invesalius3: https://github.com/invesalius/invesalius3
- tms-robot-control: https://github.com/biomaglab/tms-robot-control
In `MIME-BCI-decider.py`, the user must define the following parameters for the specific brain targets:
- `displacement_x` & `displacement_y`
- `rotation_angle`
- `intensity`
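These per-target parameters could be defined along the following lines; the structure, values, and units are illustrative assumptions, and actual values depend on the participant's head model and coil calibration:

```python
# Hypothetical target definitions for MIME-BCI-decider.py (values are placeholders).
TARGETS = {
    "left_M1": {
        "displacement_x": 10.0,   # assumed units: mm
        "displacement_y": -5.0,   # assumed units: mm
        "rotation_angle": 45.0,   # assumed units: degrees
        "intensity": 40,          # assumed units: % of maximum stimulator output
    },
    "right_M1": {
        "displacement_x": -10.0,
        "displacement_y": -5.0,
        "rotation_angle": -45.0,
        "intensity": 40,
    },
}
```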
**Important:** The `States`, `TrialIntros`, `Trials`, and `TrialResults` Enum classes must be identical in both the Decider and the Presenter files. Any mismatch will cause communication failures within the system.
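One way to guard against such a mismatch is to compare member names and values at startup. This is a suggested sketch, not part of the project; the Enum bodies below are illustrative stand-ins for the real definitions:

```python
from enum import Enum

class DeciderStates(Enum):
    WELCOME = 1
    TRIAL = 2

class PresenterStates(Enum):
    WELCOME = 1
    TRIAL = 2

def enums_match(a, b):
    """True if two Enum classes share identical member names and values."""
    return {m.name: m.value for m in a} == {m.name: m.value for m in b}

# Fail fast before the experiment starts rather than mid-session.
assert enums_match(DeciderStates, PresenterStates)
```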
1. Training data collection: Run `MIME-BCI-decider_training.py` in NeuroSimo while the participant performs movements or movement imagery to collect classifier training data.
2. Model training: Train your model by running `train_model.py`.
3. Real-time BCI-TMS operation:
   - Ensure your trained model path is correctly set in `MIME-BCI-decider.py`.
   - Run the system in NeuroSimo to administer real-time robot-guided TMS based on the participant's brain activity.
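Loading the trained model for the real-time step might look like the following sketch; the path, filename, and `pickle` serialization are assumptions (the project may persist its classifier differently):

```python
import pickle
from pathlib import Path

# Hypothetical location; point this at your own trained model file.
MODEL_PATH = Path("models/mi_classifier.pkl")

def load_classifier(path):
    """Load a pre-trained classifier serialized with pickle (illustrative)."""
    with open(path, "rb") as f:
        return pickle.load(f)
```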