We have a dataset with specific artifacts and are trying to apply tSSS with mf_st_duration set to the duration of the signal. The problem is that the resting-state run is much shorter than the experimental run, so no single duration fits all runs.
What would be the best fix for this?
Can we set a different mf_st_duration per task?
Thanks in advance!
One option would be to internally use min(st_duration, raw.times[-1]) and document that behavior. Then in your code you could set st_duration = 10000 or whatever and it would automatically be trimmed to the duration of each run.
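For reference, here is roughly what that clamping could look like if done by hand at the MNE-Python level, until/unless the pipeline does it internally. This is a minimal sketch: the file names and the 10,000 s sentinel value are placeholders, and it assumes per-run FIF files that maxwell_filter can process directly.

```python
import mne

# Request an effectively "whole-run" tSSS buffer via a large sentinel value,
# then trim it to each run's actual length.
requested_st_duration = 10_000.0  # seconds; arbitrary large placeholder

for fname in ("rest_raw.fif", "task_raw.fif"):  # hypothetical file names
    raw = mne.io.read_raw_fif(fname)
    # Clamp to the duration of this particular run so short runs still work.
    st_duration = min(requested_st_duration, raw.times[-1])
    raw_tsss = mne.preprocessing.maxwell_filter(raw, st_duration=st_duration)
    raw_tsss.save(fname.replace("_raw.fif", "_tsss_raw.fif"))
```

With this approach a single config value covers both the short resting-state run and the longer experimental run, since each run's buffer is trimmed independently.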