I think I’ve run into a bug in the Bivariate Granger Causality (NxN) routine and I hope you can help me with it. I’m extracting scout time series from a 2-second data segment, but I notice that sometimes one or more scouts return a series of all zeros. As soon as the GC code reaches such a channel, it stops and fills the entire causality matrix with zeros. I have other examples in my analysis where this happens; the attached figure is just one case.
In my view, the calculation should skip or handle the zero series and keep going. Only the rows and columns for frontalpole R, for example, should be zero.
Just following up on this issue I raised regarding the Bivariate Granger Causality (NxN) routine and how it handles zero time series. I understand you may be busy, but I’d appreciate any thoughts or updates when you have a chance.
Let me know if I can provide anything else to help.
Thanks for sharing the data; it was useful for identifying the bug you reported.
This was not really related to the GC itself, but to the fact that signals are standardized (mean 0, std 1) before computing GC, so flat channels were causing trouble: their standard deviation is zero.
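To illustrate the failure mode (this is a minimal NumPy sketch, not Brainstorm's actual MATLAB code; the function name `safe_zscore` is hypothetical): standardizing a flat channel divides by a zero standard deviation, producing NaN/Inf values that then propagate through the GC computation. A guarded z-score leaves flat channels at zero instead:

```python
import numpy as np

def safe_zscore(data, eps=1e-12):
    """Standardize each row (channel) to mean 0, std 1.

    Flat channels (std ~ 0) are left as all-zeros instead of
    dividing by zero, which would fill the output with NaN/Inf.
    """
    data = np.asarray(data, dtype=float)
    mean = data.mean(axis=1, keepdims=True)
    std = data.std(axis=1, keepdims=True)
    flat = (std < eps).ravel()
    std_safe = np.where(std < eps, 1.0, std)  # avoid division by zero
    z = (data - mean) / std_safe
    z[flat, :] = 0.0                          # flat channels stay zero
    return z, flat

# A 3-channel example where channel 1 is all zeros:
x = np.vstack([np.sin(np.linspace(0, 10, 100)),
               np.zeros(100),
               np.random.randn(100)])
z, flat = safe_zscore(x)
# flat -> [False, True, False]; z contains no NaNs
```

With a guard like this, only the rows and columns of the flat scout end up zero in the causality matrix, as you suggested.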
Another issue that arose in the data is that sometimes the ratio restricted_variance / unrestricted_variance was smaller than 1, which suggests that the model considering both x and y performed worse than the model considering only x. In theory this should not happen; it seems to be related to numerical issues, as in the cases where this occurs the difference (restricted_variance / unrestricted_variance) - 1 is on the order of 10^-15.
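Since the GC value is the log of that variance ratio, a ratio a hair below 1 yields a small negative causality. One common handling (a hedged sketch, not necessarily the exact fix in the linked code; `gc_from_variances` and the tolerance are illustrative) is to clamp the ratio at 1 when the deficit is within round-off:

```python
import numpy as np

def gc_from_variances(var_restricted, var_unrestricted, tol=1e-10):
    """Granger causality value from residual variances.

    In exact arithmetic var_restricted >= var_unrestricted, so the
    ratio is >= 1 and GC = log(ratio) >= 0. In floating point the
    ratio can fall below 1 by ~1e-15; clamp it so GC is never a
    tiny spurious negative number.
    """
    ratio = var_restricted / var_unrestricted
    if ratio < 1.0:
        if 1.0 - ratio > tol:
            raise ValueError("ratio < 1 beyond numerical tolerance")
        ratio = 1.0  # attribute the deficit to round-off
    return np.log(ratio)

# Round-off case: ratio - 1 is on the order of -1e-15
gc = gc_from_variances(1.0 - 1e-15, 1.0)
# gc == 0.0 instead of a spurious negative value
```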
The fixes for this are located in these links; these lines of code are not yet in the main Brainstorm codebase, as we need to do some more testing.
Thank you so much for the huge effort and commitment in solving this issue. I’ll start working on this fix right away. Really appreciate the great work!
The mvgc plugin is correctly installed; however, I believe the problem is related to the new method linked to the MVGC toolbox. In fact, if I run the N x N Granger causality with the old method, it works. Any suggestions?
You may want to check your data; to get that error, it can be that:

1. In the data, the number of channels is higher than the number of time samples, or
2. In the data, there are channels that are a linear combination of other channels (i.e., they are linearly dependent).
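Both conditions can be checked before running the process. A minimal sketch in Python/NumPy (Brainstorm itself is MATLAB, so this is just illustrative; `check_gc_input` is a hypothetical helper):

```python
import numpy as np

def check_gc_input(data):
    """Sanity-check a (n_channels x n_samples) matrix before MVGC.

    Returns a list of problems: more channels than time samples,
    or rank-deficient data (some channels are linear combinations
    of others).
    """
    n_channels, n_samples = data.shape
    problems = []
    if n_channels > n_samples:
        problems.append("more channels than time samples")
    if np.linalg.matrix_rank(data) < n_channels:
        problems.append("channels are linearly dependent (rank deficient)")
    return problems

# Example: 3 channels where the third is the sum of the first two
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 100))
data = np.vstack([x, x[0] + x[1]])
# check_gc_input(data) -> ["channels are linearly dependent (rank deficient)"]
```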
The older method (unconditional GC) may not return an error, but the computation of unconditional GC can lead to spurious causalities. These spurious causalities are addressed by computing GC with the MVGC toolbox; that is why MVGC is the recommended approach.
I am using 6 minutes of distributed-source EEG at 250 Hz. I reconstructed the EEG with sLORETA using the kernel; therefore I pass the source link into the GC. I used an individual anatomy downsampled to 15000 vertices, then applied the downsample-to-atlas step, so basically I have 68 (regions) x 90000 (time points). The atlas time series is not obtained beforehand, but during the GC computation, directly from the process. In this scenario, point 1 cannot apply.
Could the source inversion and the downsampling to the atlas result in a linear combination between channels? If this were the case, would it mean that the distributed source approach is not applicable to conditional GC, given the introduced interdependence between channels (vertices or ROIs)? I am sorry, but I am not sure how to properly reply to your point 2.
That is right. The time series in each vertex are a linear combination of the time series in the channels, and each scout time series is a linear combination of the time series of the vertices inside the scout (if "mean" is used as the scout function). I.e., the rank is given by the number of sensors.
Conditional GC could still work if the number of scouts is smaller than the number of sensors, but this is not guaranteed. One thing to test would be to extract the scout time series with the process Extract scout time series and check their rank.
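The rank cap described above is easy to demonstrate. A hedged NumPy sketch (the random mixing matrix stands in for the imaging kernel plus scout averaging, which together form one linear operator from sensors to scouts):

```python
import numpy as np

rng = np.random.default_rng(42)
n_sensors, n_scouts, n_samples = 10, 20, 1000

# Sensor-level time series (full rank: 10)
sensors = rng.standard_normal((n_sensors, n_samples))

# The imaging kernel and scout averaging collapse into one linear
# operator mapping sensors -> scouts (hypothetical random mixing)
mixing = rng.standard_normal((n_scouts, n_sensors))
scouts = mixing @ sensors

# Even though there are 20 scout series, their rank is capped by
# the number of sensors they were mixed from:
rank = np.linalg.matrix_rank(scouts)
# rank == 10
```

So when the number of scouts exceeds the number of sensors, the scout matrix is necessarily rank deficient; with fewer scouts than sensors it may be full rank, but that has to be checked.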
I tried what you suggested, namely extracting the time series from the source link and then checking the matrix rank. I hope I used the correct function and checked correctly. I attach a figure showing that when checking the rank with the MATLAB rank(Matrix) function I obtain 68.
At this point I am not sure where the error could be: since every scout is independent from the others based on the rank, in principle the function should work.
If I understand correctly, the ROI time series matrix cannot, by definition, be positive definite, given that it is not square. This is at least my understanding of the algebra, which is limited for sure.
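(For what it's worth, my current understanding is that the positive-definiteness requirement applies not to the rectangular data matrix itself but to the covariance matrix of the time series, which is square; I sketched this in NumPy to check, though I may be wrong:)

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal((3, 500))

# The data matrix (3 x 500) is not square, but the object that
# must be positive definite is its covariance matrix, which IS
# square:
cov_ok = np.cov(x)                           # 3 x 3
eig_ok = np.linalg.eigvalsh(cov_ok).min()    # clearly > 0

# Appending a channel that is a linear combination of the others
# makes the covariance singular (smallest eigenvalue ~ 0), which
# is what breaks positive definiteness:
y = np.vstack([x, x[0] + 2 * x[1]])
cov_bad = np.cov(y)                          # 4 x 4
eig_bad = np.linalg.eigvalsh(cov_bad).min()  # ~ 0 (up to round-off)
```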
What can be done in this case? Maybe the error is not related to what we think?
If you want, I can send you some test data so you can try to reproduce the error yourself.
Let me know
Thanks