
[MAINT] Update to latest MNE version #66

Closed · wants to merge 3 commits

Conversation

alexrockhill

Related to https://mne.discourse.group/t/hcp-data-compatibility/4433.

MNE-HCP isn't maintained rigorously but Andy (sorry don't know your GitHub handle) is interested.

I thought I would get the ball rolling with a PR for an example.

First, I'm just adding an empty commit to see if the existing mne-hcp tests will point out the failures that we'll need to fix. (Usually, you would make your changes and then run `git add -A` to stage all the files, or add each changed file individually from `git status`, and then run `git commit -m "your message about what you changed"`.)

@alexrockhill
Author

Uh oh, looks like no tests. From here, I will make a minimal reproducible example from the data that Andy wanted to download and possibly implement it as a test.

@alexrockhill
Author

Ok, I looked around a bit, and it looks like the GitHub Actions are out-of-date. Andy expressed interest in updating this during a Sprint. I think fixing some of the code wouldn't be too hard, but the GitHub Actions part really involves a lot of setup. Andy, you could try copying things over from a currently-maintained package like mne-python, or, I think better, mne-bids, because it's a bit more lightweight. I'm a bit hesitant to recommend that you go down this rabbit hole, though: it's a lot of work, and while it would help you learn about the MNE ecosystem, I'm not sure the returns are worth it when you can just use an older version of MNE (e.g. 0.19) and get the data. Your time might be better spent contributing to things with a bit more return. Although it would be super cool if this were functional again, it's in danger of being a time sink rather than a quick fix.

@alexrockhill
Author

Hmmm okay, I gave migrating the tests over to a more modern framework a quick try so that the package could potentially be updated, but I ran into a roadblock: I couldn't get the testing data via s3cmd. I think it's somewhat close. If we had the testing data (it might need its own repository, like the other MNE testing data), Andy could spend a Sprint writing PRs to fix all the tests and examples if he were so inclined, but I'm going to stop here for now. If @dengemann maybe had a second to share that data, then we could update it.

@andylikescodes

@alexrockhill Yes, I would actually love to do that. In fact, I don't need to wait until the Sprint; I can start looking at it now. It is an interesting project for me to learn how things work in general. I might spend an hour or so a day looking at it. I am quite new to contributing to an open source project, although I have been using GitHub for a long time for collaborations within my lab, so I might be a bit slow at the beginning while figuring things out, e.g., how to set up tests and GitHub Actions. I will let you know if I have any questions, and I will probably attend another MNE office hours to discuss this further. In the meantime, if you have any good resources that I could read, that would be great. Thank you!

@alexrockhill
Author

@agramfort, do you have a copy of the testing data for this repo? It doesn't seem like too much work to get it up-to-date, and it would be a shame not to at least try to maintain Denis's work. I haven't set up s3cmd, so I don't have access to it. Sorry to bug you.

@dengemann
Member

... I'd need to dig somewhere. I don't remember well if there was even a script somewhere to generate the testing data. Unfortunately, they are not on my current computer, I will need to do some archival work.

@alexrockhill
Author

> ... I'd need to dig somewhere. I don't remember well if there was even a script somewhere to generate the testing data. Unfortunately, they are not on my current computer, I will need to do some archival work.

If you had a minute to do that, it would be super helpful! You could maybe just get it off of AWS with s3cmd...

@dengemann
Member

It won't be a minute, unfortunately :( but I'll look into it within the next week.

@andylikescodes

@alexrockhill Here's the post that I intended to post on the MNE forum in reply to your response (https://mne.discourse.group/t/hcp-data-compatibility/4433/5). As you suggested, it might be good to move our conversation here:

The HCP data for one subject is really large. The copy for testing that I have on my laptop is more than 50 GB, and I am not sure how I can share it with you.

Basically, I just follow the tutorial here to extract the anatomy from the HCP data: https://mne.tools/mne-hcp/auto_examples/make_mne_anatomy.html#sphx-glr-auto-examples-make-mne-anatomy-py

The MNE-HCP seems to provide a good API for running source localization: https://mne.tools/mne-hcp/auto_tutorials/plot_compute_evoked_inverse_solution.html#sphx-glr-auto-tutorials-plot-compute-evoked-inverse-solution-py

And they also provide their own function of plotting the co-registration: https://mne.tools/mne-hcp/auto_examples/plot_coregistration.html#sphx-glr-auto-examples-plot-coregistration-py

However, they mention that for various reasons they didn't use the MNE coordinate frame for the source localization, without saying explicitly why. This creates some problems when visualizing the HCP data in MNE: https://mne.tools/mne-hcp/generated/hcp.preprocessing.map_ch_coords_to_mne.html#hcp.preprocessing.map_ch_coords_to_mne

I am trying to study the source code of MNE-HCP to understand how the process is done. I might attach some code later if I find it useful for clarification. I am curious why they didn't choose to use the MNE coordinate frame in the first place. It might be a good clarification question for Denis, but do let me know if you know anything about this.

@alexrockhill
Author

Yeah, I haven't used HCP data, so I have no idea about the choice of coordinate frames, but I'm happy to work through examples with you.

The example data Denis shared is ~400 MB, so that'll be what we want to work with. I assume he downloaded an entire dataset and cropped everything down as short as possible so that you get all the formatting without huge file storage.
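The cropping idea can be sketched in a few lines. This is illustrative only, not MNE-HCP code; the sampling rate and variable names are made up:

```python
# Illustrative sketch only -- not MNE-HCP code. Shipping small testing
# data usually means cropping a full recording down to a few seconds so
# the file format stays intact while the storage footprint shrinks.

sfreq = 100                               # assumed sampling rate in Hz
full_recording = list(range(60 * sfreq))  # pretend 60 s of samples
cropped = full_recording[:2 * sfreq]      # keep only the first 2 s

print(len(full_recording), len(cropped))  # 6000 200
```

The same idea applies per file: every format in the dataset is kept, but each recording is truncated.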

@andylikescodes

> Yeah, I haven't used HCP data, so I have no idea about the choice of coordinate frames, but I'm happy to work through examples with you.
>
> The example data Denis shared is ~400 MB, so that'll be what we want to work with. I assume he downloaded an entire dataset and cropped everything down as short as possible so that you get all the formatting without huge file storage.

Sounds good. I will come to the MNE office hour this Friday. I think we can take a look at it together then.

@alexrockhill
Author

So I think the resolution was that nothing actually needs to be fixed: the seemingly incorrect orientation was by design, because HCP puts everything in BTI coordinates, and those are ALS, not RAS. Closing this. Since we can't have local access to the data, I don't think it's necessary to update the tests; they should just stay as-is. If you want to walk through how to do everything in BTI coordinates next office hours, I'd be happy to help with that @andylikescodes
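For reference, the ALS-vs-RAS relationship comes down to a fixed axis permutation. This is a minimal illustration, not the MNE-HCP implementation (`als_to_ras` is a hypothetical helper name):

```python
# Minimal illustration, not the MNE-HCP implementation. BTI/4D device
# coordinates are ALS (x = Anterior, y = Left, z = Superior), while MNE
# uses RAS (x = Right, y = Anterior, z = Superior). Going from one to
# the other is an axis permutation with a single sign flip.

def als_to_ras(point):
    """Convert an (A, L, S) coordinate triple to (R, A, S)."""
    a, l, s = point
    return (-l, a, s)  # Right = -Left; Anterior and Superior carry over

# A point 1 unit anterior, 2 left, 3 superior:
print(als_to_ras((1.0, 2.0, 3.0)))  # -> (-2.0, 1.0, 3.0)
```

Forgetting this permutation is exactly the kind of thing that makes a coregistration plot look rotated by 90 degrees rather than genuinely wrong.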

@mscheltienne
Member

FYI, I started an update of the package here to make it compatible with recent MNE versions: #71
I don't intend to get the tests that require a dataset running.
