Clean-up, update packaging, fix imports #71

Open · wants to merge 34 commits into master
Conversation

@mscheltienne (Member) commented Apr 22, 2024

This package is barely maintained, but I still see a question about it on the forum from time to time, or someone contacts me directly with questions about it.
This PR fixes the deprecated imports, fixes the style, updates the packaging, ...

WIP

@mscheltienne (Member, Author)

@dengemann Do you remember what the dictionary.txt file is and if it's actually needed? https://github.com/mne-tools/mne-hcp/blob/master/dictionary.txt It was added as part of the first commit.

Also, @dengemann or @alexrockhill would you have the datasets required to run the tutorials and examples? It's almost the last piece missing to get the build green.

@larsoner (Member)

That looks like a codespell dictionary. Nowadays you shouldn't need to package it separately; just install codespell and use its built-in dictionaries.
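
For reference, a minimal sketch of that workflow, using codespell's standard built-in dictionary names (nothing here is specific to this repo):

$ pip install codespell
$ codespell hcp/ --builtin clear,rare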

@mscheltienne (Member, Author)

That's what I assumed and I removed it ;)

@alexrockhill

> @dengemann Do you remember what the dictionary.txt file is and if it's actually needed? https://github.com/mne-tools/mne-hcp/blob/master/dictionary.txt It was added as part of the first commit.
>
> Also, @dengemann or @alexrockhill would you have the datasets required to run the tutorials and examples? It's almost the last piece missing to get the build green.

Dennis was saying last time that all the data is stored on AWS so that the CIs can run it, but we can't store it somewhere publicly accessible because of dataset protections. You can download the dataset here, though: https://db.humanconnectome.org/. It's not too hard to do; you just have to register an account and agree to the license.

@larsoner (Member)

@mscheltienne ideally the reformatting changes would be in a separate PR so they could be added to .git-blame-ignore-revs. But maybe that's not worth it here.
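
For context, a minimal sketch of that convention (the SHA below is a placeholder for the actual reformatting commit):

$ echo "<sha-of-reformatting-commit>" >> .git-blame-ignore-revs
$ git config blame.ignoreRevsFile .git-blame-ignore-revs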

I think we also want some very minimal CI runs -- I'll open a quick PR to add those first; then merging master into this PR should just make sure that things aren't more broken.

@mscheltienne (Member, Author)

IIRC, I started to look into fetching the HCP dataset from AWS to test the changes in this PR, but lacked the time. If someone has the time to figure out a GitHub Action to do so, we could easily add CI and a doc build.

@larsoner (Member)

@mscheltienne I pushed a couple of commits; I've been testing locally with:

$ MNE_HCP_TEST_DECIM=1 pytest -rfEXs --cov-report= --tb=short --durations 30 -v --cov=hcp --cov-report xml hcp

on latest MNE main, after running the new tools/get_testing_data.sh (a ~27 GB data download!) with an appropriate ~/.s3cfg. I marked one test as xfail that we should probably fix, and there is a remaining error in hcp/tests/test_anatomy.py, but I probably just need more files. Will test a bit more tomorrow.
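
For anyone reproducing this: ~/.s3cfg is s3cmd's configuration file, so one way to set it up (assuming you have HCP AWS credentials obtained via https://db.humanconnectome.org/) is:

$ pip install s3cmd
$ s3cmd --configure
# interactive prompt: paste the HCP access key and secret key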

@larsoner (Member)

... we should wait for the mne-hcp-testing-data.tar.gz that @dengemann generated with this notebook, if possible. @dengemann, if you can't find it, I might be close to being able to regenerate it. Then we might just upload it to osf.io as an encrypted archive and extract it using a password stored as a GitHub secret.
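
A rough sketch of that encrypt/decrypt scheme with gpg (the secret name MNE_HCP_DATA_PASSWORD is hypothetical):

# one-off, before uploading the archive to osf.io
$ gpg --symmetric --cipher-algo AES256 mne-hcp-testing-data.tar.gz
# in CI, with the password exposed from a GitHub secret
$ gpg --batch --passphrase "$MNE_HCP_DATA_PASSWORD" --decrypt mne-hcp-testing-data.tar.gz.gpg > mne-hcp-testing-data.tar.gz
$ tar xzf mne-hcp-testing-data.tar.gz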

@larsoner (Member)

Okay, a couple more commits, and now everything works on my system with the full dataset! If anyone else wants to test, you can download the ~27 GB version and then test with:

$ ./tools/get_testing_data.sh
$ MNE_HCP_TESTING_PATH=~/mne-hcp-data/HCP MNE_HCP_TEST_DECIM=1 pytest -rfEXs --cov-report= --tb=short --durations 30 -v --cov=hcp --cov-report xml hcp

@larsoner (Member)

@dengemann any chance you could look?

Without this dataset I'm not sure we can do much here maintenance-wise 😬

@agramfort (Member) commented Nov 1, 2024 via email
