
Motion-BIDS #1145

Open
JuliusWelzel opened this issue Jun 2, 2023 · 19 comments
Labels
enhancement, MNE-Python (needs interaction with MNE-Python)

Comments

@JuliusWelzel

Describe the problem

Hello,

As of March, BEP029 for motion data has been merged into the BIDS specification. A few minor details still need to be finalized, but the core specification is done.

We would like to extend the functionality of MNE-BIDS to motion data, to provide a Python-based tool for converting sourcedata to BIDS. A MATLAB-based version is already implemented in FieldTrip via the data2bids function, with the help of @sjeung, @robertoostenveld, and @helenacockx.

Having MNE-BIDS support motion data would make converting data a lot easier for non-MATLAB users.

Describe your solution

Implement import functions for the most commonly used motion data formats in MNE, and update write_raw_bids so it can handle the datatype 'motion'.

Describe possible alternatives

Provide a more flexible solution without dedicated import functions: import motion data into an MNE Raw object and assign the relevant metadata so that the data can be converted to BIDS.

Additional context

No response


@sappelhoff
Member

Hi @JuliusWelzel thanks for the proposal :-)

perhaps looking at the PR that made NIRS available in MNE-BIDS can be helpful: https://github.com/mne-tools/mne-bids/pull/406/files

what exactly will you need mne-bids to do?

Have you worked with mne-bids before? If not, I'd recommend you have a look at the examples and execute them yourself / play around with them ... or perhaps try to convert one of your EEG datasets (if you have one) to BIDS via mne-bids.

@JuliusWelzel
Author

@sappelhoff Thank you for the help. I converted a few EEG datasets via MNE-BIDS. NIRS and other data have dedicated formats that can be read into MNE, as far as I understand. I believe this is not the case for motion data. Would it make sense to think about a motion reader for MNE as a first step? The problem is that there are a lot of different motion data formats available.

@sappelhoff
Member

Would it make sense to think about a motion-MNE reader as a first step? The problem is, there are a lot of different motion datatypes available.

indeed, it'd be good for mne-bids to support one (or at most "a few") data format(s) from which to do the conversion. Would it be possible to represent a motion dataset in an MNE-Python .fif file? Then you could require people who want to work with mne-bids and motion data to first get their data into the .fif format (in some way, depending on their particular file format; you could write an "example" for such a workflow) ... and then mne-bids would only need to work on the .fif format.

does that sound like a possibility? Otherwise you are right ... we'd need readers for each motion file format that you want to support, and we would need to implement these readers outside of mne-bids (and possibly even outside of mne-python, as a separate package)

@JuliusWelzel
Author

I will check whether that is possible. It would still be good to write import functions for .c3d and .hdf at some point. @SjoerdBruijn has generously offered support on that end :)

@JuliusWelzel
Author

I've looked at a couple of options for getting motion data into MNE. One solution would be to use the .edf format with PyEDFlib. Another option would be to pass motion data to an MNE object, similar to importing data from .xdf files. The challenge remains the endless variety of formats in which motion data is recorded.

Would it be possible to add a dedicated motion channel type to the MNE channel types?

Any feedback is appreciated :)

@robertoostenveld

For the mne-bids converter you indeed first need a Python-based reader supported in MNE-Python. AFAIK the EDF format is not used by any motion capture system, so I don't think it is of use; it just shifts the problem to "How do I convert my data in Python from .c3d (or some other format) to .edf?"

The first MNE-Python importer that I would write is one that imports the data from the motion-BIDS format, i.e. one that reads the tsv+json files and represents it as MNE-Python data in memory.
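The parsing side of such a reader is mostly plain tsv+json handling. A stdlib-only sketch, where the file contents, column names, and sidecar fields follow my reading of the motion-BIDS spec and should be verified against the current specification:

```python
import csv
import io
import json

# Hypothetical contents of a *_channels.tsv and a *_motion.json sidecar.
# Column names (name, component, type, tracked_point, units) are my reading
# of the motion-BIDS spec; check the current specification before relying
# on them.
channels_tsv = (
    "name\tcomponent\ttype\ttracked_point\tunits\n"
    "t1_acc_x\tx\tACCEL\tLeftWrist\tm/s^2\n"
    "t1_acc_y\ty\tACCEL\tLeftWrist\tm/s^2\n"
)
motion_json = '{"SamplingFrequency": 100, "TaskName": "walking"}'

# channels.tsv is tab-separated with a header row.
channels = list(csv.DictReader(io.StringIO(channels_tsv), delimiter="\t"))
sidecar = json.loads(motion_json)

# These two pieces are what an in-memory representation would need at minimum.
sfreq = sidecar["SamplingFrequency"]
ch_names = [row["name"] for row in channels]
```

From here, the data matrix plus `sfreq` and the per-channel metadata could be handed to whatever MNE-Python structure ends up representing motion data.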

@JuliusWelzel
Author

AFAIK the EDF format is not used by any motion capture systems

That is true indeed. The idea is to provide Python import functions for the most commonly used formats in the future.

The first MNE-Python importer that I would write is one that imports the data from the motion-BIDS format, i.e. one that reads the tsv+json files and represents it as MNE-Python data in memory.

That would require MNE to accept motion as a channel type?

@sappelhoff
Member

sappelhoff commented Jun 19, 2023

what exactly would a motion channel type look like? How would it be different from the channel types that are already supported in MNE-Python? --> https://mne.tools/dev/overview/implementation.html#supported-channel-types

The first MNE-Python importer that I would write is one that imports the data from the motion-BIDS format, i.e. one that reads the tsv+json files and represents it as MNE-Python data in memory.

I think we agree that for now it's important that we clarify how to represent motion data in MNE-Python data structures.

Once we have that, we can work on several ways to go from raw motion data to MNE-Python data structures. (Note that here I would place a slightly different focus from Robert: I'd try to prepare examples of how to easily convert from some raw motion data to MNE-Python data, instead of directly from BIDS to MNE-Python data ... because the latter would require people to organize their data in BIDS, which is something we want them to do automatically via mne-bids.)

Once that is done, converting to BIDS via mne-bids would be easy.

@JuliusWelzel
Author

The current specification of motion channel types can be found here. There is some overlap (mag), but other channel types define positions, orientations, or their time derivatives, which is not reflected in MNE's supported channel types. I am not sure how closely related MNE-BIDS and MNE are, but for future BIDS updates, it might be useful to allow reasonable channel types from BIDS in MNE, such as motion. That would of course mean extending the units in MNE as well.

For now, all relevant information for motion data from a *_motion.json and *_channels.tsv can be represented in an MNE Raw object, provided the motion channel types are accepted in MNE.
I could provide an example of loading motion data as bio channels in MNE, and we could work from there?

@robertoostenveld

You may want to have a look at https://www.fieldtriptoolbox.org/example/bids_motion/ for an example; the data (sourcedata including data in 4 original formats, and bids created using data2bids) is available from https://download.fieldtriptoolbox.org/example/bids_motion/.

Please note that I did not review this example for quite some time and hence it might be outdated wrt details that have been updated in the spec. If you find something that is not correct, please file an issue on https://github.com/fieldtrip/website/issues so that I can fix it.

@JuliusWelzel
Author

it might be outdated

@sjeung and I are finalising the Motion BEP and example datasets and will get back to this fieldtrip example once everything is done

@sappelhoff
Member

I could provide an example of loading motion data as bio channels in MNE and we work from there?

Yes! Please try to work with what's already there and identify the pain points (if any) -- then we'll take it from there and see what needs to be adjusted in MNE-Python.

@JuliusWelzel
Author

JuliusWelzel commented Aug 14, 2023

Hello, I have put some code and data for testing here.

The repo contains a sample dataset that we typically get from a cheap IMU system.

The metadata and data are loaded into some dataclasses and then transferred to an MNE Raw object.

When trying to use write_raw_bids, I get the following error:

"ValueError: The specified datatype beh is currently not supported. It should be one of either meg, eeg or ieeg (Got beh. Please specify a valid datatype using bids_path.update(datatype="<datatype>")."

So I think motion datatypes should be allowed as inputs, and the main data should then be written to a *.tsv file. This is where I need some support or guidance.
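The writing side of that *.tsv file is simple in itself: per my reading of the motion spec, *_motion.tsv data files carry no header row, and the channel order is defined by the accompanying *_channels.tsv. A stdlib-only sketch with illustrative values:

```python
import csv
import io

# Illustrative data: 3 time points x 2 channels. The channel order is
# assumed to match the accompanying *_channels.tsv.
samples = [
    [0.01, 9.81],
    [0.02, 9.80],
    [0.03, 9.79],
]

buf = io.StringIO()
writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
writer.writerows(samples)  # one row per time point, no header row
tsv_text = buf.getvalue()
```

In mne-bids this would presumably hook into write_raw_bids once 'motion' is an accepted datatype; the sketch only shows the file format itself.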

@agramfort
Member

agramfort commented Sep 30, 2023 via email

@JuliusWelzel
Author

Hello,

thanks for the support!

I would add motion as a channel type here and add the related units here. Is there any other part of the code I need to take care of?
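As a starting point for the units, a mapping along these lines might help. The type names and units below are my reading of the motion-BIDS channels.tsv "type" column (and only a subset); they should be verified against the current specification before being wired into MNE:

```python
# Candidate motion channel types and SI units. Assumption: names follow the
# motion-BIDS channels.tsv "type" column -- verify against the current spec.
MOTION_CHANNEL_UNITS = {
    "POS": "m",         # position
    "ORNT": "rad",      # orientation
    "VEL": "m/s",       # velocity
    "ACCEL": "m/s^2",   # acceleration (incl. gravitational)
    "ANGVEL": "rad/s",  # angular velocity
}

def unit_for(ch_type: str) -> str:
    """Look up the expected unit for a motion channel type."""
    return MOTION_CHANNEL_UNITS.get(ch_type, "n/a")
```

MNE would additionally need internal constants for these types; the dict only captures the type-to-unit relationship.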

@adam2392
Member

Just chiming in here.

Yes, that is what I suspect would be best. This would require you to have a minimal example dataset that you can use for testing, and probably some downstream PRs to document this new datatype in MNE-Python. I would open an issue/PR there to begin.

@JuliusWelzel
Author

Thanks @adam2392 for the pointer. I already uploaded some test data, with code showing how to get this example into Python: #1145 (comment). Feel free to have a look and give feedback; in the meantime I will open a PR.

@JuliusWelzel
Author

I have opened the PR. Please let me know how to implement the different types of motion channels. I have already started adding coil types and would like to automatically derive the channel type constants here.

@sappelhoff sappelhoff added the MNE-Python needs interaction with MNE-Python label Nov 7, 2023