Strange bug when saving MEM results

Hello,

I know this is probably more a bug in either nirstorm or MEM, but since I am starting to question my sanity, I thought I could come ask my question here:

TL;DR: I am trying to localize 1000 s of signal using MEM, and for some reason the first 300 s of the reconstructed signal are exactly 0.

Solved here: Strange bug when saving MEM results - #2 by edelaire

Long version:

I have 1000 s of NIRS signal, and I want to localize it using MEM or MNE. But for some reason, the beginning of the file doesn't get localized and is set to exactly zero, which makes no sense!

I tried to localize only part of the signal (between 100 s and 200 s), and then I got the signal:

But when I tried to localize the entire recording, the beginning is forced to 0. After reading back the entire code 4 times and trying some voodoo magic, I just put a breakpoint after the computation of MEM and before the saving to the database (here: nirstorm/bst_plugin/inverse/process_nst_cmem.m at master · Nirstorm/nirstorm · GitHub)

and to my surprise: the signal is there. I plotted the average over a specific ROI using

    % vertices of the second scout of the first atlas on the cortex surface
    idx = cortex.Atlas(1).Scouts(2).Vertices;
    % average MEM amplitude over the ROI, before saving to the database
    figure; plot(OPTIONS.DataTime, mean(grid_amp(idx,:)))

but if I then look at the file in the database, no signal is there between 0 and 300 s.

Left is the signal plotted at the breakpoint before saving in Brainstorm; right is the same file read back from the Brainstorm database. They should be exactly the same (and they are at the end of the recording).

Edit: So I put a breakpoint here, just after the saving to the Brainstorm database, and the results don't make sense at all: nirstorm/bst_plugin/inverse/process_nst_cmem.m at master · Nirstorm/nirstorm · GitHub

I used the following code to compare what we are trying to save with what is actually saved, and the beginning of the file just gets chopped off. I don't understand why.


    % load the cortex surface used by the head model and get the ROI vertices
    sCortex = in_tess(head_model.SurfaceFile);
    idx     = sCortex.Atlas(1).Scouts(2).Vertices;

    % load the result file back from the Brainstorm database
    sDataNew = in_bst_results(ResultFile);

    % left: what we are trying to save; right: what is actually in the database
    figure;
    subplot(121);
    plot(time, mean(data(idx,:)))
    subplot(122);
    plot(time, mean(sDataNew.ImageGridAmp(idx,:)))

Resulting in:

If you have any ideas, I would really appreciate it!
Thanks a lot,
Edouard

OK, found the issue. I changed the following line (nirstorm/bst_plugin/inverse/process_nst_cmem.m at master · Nirstorm/nirstorm · GitHub)
from `bst_save(ResultFile, ResultsMat, 'v6');` to `bst_save(ResultFile, ResultsMat, 'v7.3');`
and it is working.

@Raymundo.Cassani It would be great to have a warning or error when bst_save doesn't work :)

Is this an issue with the size of the structure? The size of the ImageGridAmp field? Or something else? Yes, a warning or automatic handling would be good, but I'm not sure what to check for.

What happens if you load the v6 MAT file directly in Matlab? Do the sources have the right size, but with zeros?

Currently, there is a test in bst_save that changes the MAT version to v7.3 if the data is larger than 2 GiB. But I guess that check did not trigger here, which is why the file was only partially saved in the v6 MAT format.
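
For reference, something like this is the kind of check/safety net I mean (just a sketch, not the actual bst_save code; I'm assuming ImageGridAmp can be reloaded from the result file with load, and ResultsMat/ResultFile are the names used above):

    % Sketch only: estimate the in-memory size of the structure, pick the MAT
    % version accordingly, then verify the saved file to catch a truncated save.
    info = whos('ResultsMat');                    % size of the variable in memory
    if info.bytes > 2 * 1024^3
        matVersion = 'v7.3';                      % v6/v7 cannot store variables > 2 GiB
    else
        matVersion = 'v6';
    end
    bst_save(ResultFile, ResultsMat, matVersion);

    % Safety net: reload ImageGridAmp and compare its size with the data in memory
    check = load(ResultFile, 'ImageGridAmp');
    if ~isequal(size(check.ImageGridAmp), size(ResultsMat.ImageGridAmp))
        warning('bst_save: saved ImageGridAmp does not match the data in memory.');
    end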

OK, I might have gone insane during the weekend. I just went back home, plugged in my hard drive, opened Brainstorm, and the files that were showing zeros earlier are just fine. Like, the data are there now... I mean, I didn't invent the screenshot earlier.

I guess I'll go to sleep and try again tomorrow with v6.

Yes, it had the correct size but zeros at the beginning. At least that's what I thought, but now I am not sure of anything anymore.

I just checked and the files are 1.26 GB on my hard drive, so less than 2 GiB; I don't understand.

Well, at least now it's working and the data make sense, so maybe I should not look too much into it. Here is the figure after doing the trial average mentioned in the other topic:

Can you share one of the defective v6 MAT files?

I have a deadline for submitting an abstract on this next Monday, but I will try to reproduce the issue after the deadline and send it to you if I manage to :)

So far I was not able to replicate the issue, but I was thinking of ways to reduce the size of the generated file (which is important for our lab, since some of my colleagues want to apply wMEM to long recordings).

There are two ways we can reduce the size of the file generated on the cortical surface. The first one is only applicable to NIRS, while the second one is useful every time someone uses wMEM.

  1. Saving only the vertices in the FOV.
    In NIRS, we only have partial coverage of the cortex (only where the sensors are installed). With a cortex of 15 000 vertices, we usually only reconstruct values on ~5 000 vertices (the exact number depends on the NIRS montage).

We currently fill the rest of the vertices with zeros:

    % expand the FOV-only reconstruction to the full cortex, padding with zeros
    grid_amp = zeros(nb_nodes, nb_samples);
    grid_amp(valid_nodes,:) = Results.ImageGridAmp;

Saving only Results.ImageGridAmp (~5 000 x nTimes) together with valid_nodes would save a lot of space (see the sketch after the size estimate below).

  2. wMEM is already doing a form of compression.

When using wMEM, the inverse problem is solved after a time-frequency decomposition of the data. Each time-frequency box is then localized on the cortex.

Interestingly, the time course of the signal can be reconstructed with a matrix product: best-brainstorm/best/mem/be_launch_mem.m at wMEM · Edouard2laire/best-brainstorm · GitHub

ImageGridAmp = (P * ImageSourceAmp')' = ImageSourceAmp * P', where ImageGridAmp is nVertex x nTimes, P is nTimes x nBox, and ImageSourceAmp is nVertex x nBox (knowing that P can easily be recomputed or stored alongside the data).

Depending on how many scales are localized, this can greatly reduce the number of samples to localize. In the example I am currently running, this means localizing ~3 000 boxes instead of ~11 000 time points (~1 100 s at 10 Hz).
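
Just to make the dimensions explicit, here is a toy version of that reconstruction (sizes are made up, and the variable names follow the notation above, not the actual be_launch_mem.m code):

    % Toy sizes (the real case is roughly nVertex = 5 000, nTimes = 11 000, nBox = 3 000)
    nVertex = 5; nTimes = 11; nBox = 3;

    ImageSourceAmp = randn(nVertex, nBox);   % amplitude of each time-frequency box on each vertex
    P              = randn(nTimes, nBox);    % maps time-frequency boxes back to time samples

    % Full time course on the cortex, recomputed on demand
    ImageGridAmp = ImageSourceAmp * P';                    % nVertex x nTimes
    norm(ImageGridAmp - (P * ImageSourceAmp')', 'fro')     % ~0: both forms are equivalent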


Using those two facts, we can go from saving a matrix of 15 000 x 11 000 = 165 000 000 elements to one of 5 000 x 3 000 = 15 000 000 elements, so an 11x reduction in size.
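
To illustrate both points together, here is a rough sketch of what could be stored and how the full matrix would be rebuilt at load time (the SavedMat fields are purely illustrative, not an existing Brainstorm structure; toy sizes again):

    % Toy sizes just for illustration
    nb_nodes    = 15;                   % total vertices on the cortex
    nb_samples  = 11;                   % time samples
    nBox        = 3;                    % time-frequency boxes kept by wMEM
    valid_nodes = 1:5;                  % vertices inside the NIRS FOV

    % What would be stored on disk (much smaller than the full nb_nodes x nb_samples grid)
    SavedMat.ImageSourceAmp = randn(numel(valid_nodes), nBox);   % FOV vertices x boxes
    SavedMat.ValidNodes     = valid_nodes;
    SavedMat.P              = randn(nb_samples, nBox);           % time x boxes projection

    % Reconstruction at load time: boxes -> time samples, then FOV -> full cortex
    grid_amp = zeros(nb_nodes, nb_samples);
    grid_amp(SavedMat.ValidNodes, :) = SavedMat.ImageSourceAmp * SavedMat.P';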

Do you think it would be possible to implement those two strategies in Brainstorm? I guess the second one is very similar to having a kernel in MNE, except that the data are already on the cortex and we just go from the time-frequency representation to the time representation using the projection matrix P.