Hello,
I’m currently working on processing test data I collected for a project. I initially analyzed it using Homer3, and the results looked as expected — I observed a nice increase in HbO and a corresponding decrease in HbR across all three conditions.
Since I’m particularly interested in source localization, I tried analyzing the same data with NIRSTORM. However, the HRF I obtain there looks very strange across all conditions: it is either flat (essentially zero) or shows oscillations that don’t make physiological sense. I suspect I might be doing something wrong and wanted to check whether my processing steps are appropriate.
Here’s what I did:
- Data acquired using Brite Connect from Artinis (.snirf file)
- Imported into NIRSTORM
- Removed bad channels using the scalp coupling index (SCI)
- Converted the raw intensities to delta OD
- Applied motion correction (TDDR), band-pass filtering (0.01–0.1 Hz), and superficial signal correction (SSC)
- Converted OD to delta HbO/HbR
- Imported trials (-5000 to 10000 ms, with DC offset correction; converted my events to single events)
For source localization, I also tried importing the corrected OD trials and running wMNE, but the results were again odd — I see a decrease in HbO, whereas an increase (as seen in Homer3) would make more sense given my paradigm.
I understand that the pipelines between Homer3 and NIRSTORM are quite different, but I was expecting a similar general trend (i.e., HbO increase). Do you see anything in my pipeline that could be causing this discrepancy?
Thank you for your time and help!
Edit: I've uploaded the raw data here if that helps: Data_fnirs_test - Google Drive
Best,
Maxime