timonkl/PatchedBrainTransformer
Code from the paper "Patched Brain Transformer: Flexible AI-Model for EEG Decoding".

Usage:
- For unsupervised pre-training, run preTraining.py with config['pre_train_bert'] = True.
- For supervised pre-training, run preTraining.py with config['pre_train_bert'] = False.
- To fine-tune the model, run fineTune.py.

Detailed usage instructions are in the comments in the code. Minimal sketches of the pre-training switch and of data loading follow the dependency list below.

Dependencies:
- Python 3.9
- torch 2.0.1
- numpy 1.26.0
- MOABB 0.4.6
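For orientation, here is a minimal sketch of the mode switch described above. Only the key config['pre_train_bert'] comes from this README; the surrounding structure is illustrative, not the repository's actual code.

```python
# Hypothetical sketch: only the config key 'pre_train_bert' is taken
# from this README; the rest is illustrative.
config = {'pre_train_bert': True}

if config['pre_train_bert']:
    # preTraining.py with this flag set runs unsupervised pre-training.
    print("unsupervised pre-training")
else:
    # With the flag set to False, preTraining.py pre-trains on labeled data.
    print("supervised pre-training")
```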
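Since MOABB is the listed data dependency, the following hedged example shows the standard MOABB 0.4.x pattern for fetching epoched EEG trials. The dataset and paradigm chosen here (BNCI2014001 / MotorImagery) are assumptions for illustration; this README does not state which benchmarks the scripts use.

```python
# Assumed example: standard MOABB data loading. The choice of
# BNCI2014001 and MotorImagery is hypothetical, not taken from this repo.
from moabb.datasets import BNCI2014001
from moabb.paradigms import MotorImagery

dataset = BNCI2014001()    # a 4-class motor imagery benchmark
paradigm = MotorImagery()  # defines epoching/filtering for MI decoding

# Returns epoched trials X (n_trials, n_channels, n_samples),
# labels y, and per-trial metadata.
X, y, metadata = paradigm.get_data(dataset=dataset, subjects=[1])
print(X.shape, sorted(set(y)))
```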