timonkl/PatchedBrainTransformer

Code from the paper "Patched Brain Transformer: Flexible AI-Model for EEG Decoding"

Usage:
For unsupervised pre-training, run preTraining.py with config['pre_train_bert'] = True.
For supervised pre-training, run preTraining.py with config['pre_train_bert'] = False.

To fine-tune the model, run fineTune.py.

Detailed usage instructions are provided as comments in the code.
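The pre-training mode switch described above can be sketched as a minimal config dictionary. This is an illustration only: apart from the 'pre_train_bert' key mentioned in this README, the dictionary layout and the helper function below are assumptions, not the repository's actual code.

```python
# Hypothetical sketch of the mode switch used by preTraining.py.
# Only 'pre_train_bert' is taken from the README; everything else is assumed.
config = {
    'pre_train_bert': True,  # True -> unsupervised pre-training, False -> supervised
}

def pretraining_mode(config):
    """Return a human-readable label for the selected pre-training mode."""
    return "unsupervised" if config['pre_train_bert'] else "supervised"

print(pretraining_mode(config))  # with the config above: "unsupervised"
```

Setting the flag before launching preTraining.py selects which of the two pre-training objectives is run; fineTune.py is run afterwards on the resulting checkpoint.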

Dependencies:
python = 3.9
torch = 2.0.1
numpy = 1.26.0
MOABB = 0.4.6
