When doing t-tests between two conditions in source space, I am getting very different answers from Brainstorm compared to MNE. Admittedly, there are many things I could not (or did not) hold constant across the two platforms, but I did try to keep the analyses as similar as possible. That is, both used weighted MNE for source reconstruction. Both did source reconstruction separately for each run to account for likely head movement between runs, and the results were averaged by condition in MNE and baseline corrected (but not z-score normalized to the baseline). Also, both t-tests were done assuming unequal variance, and for each condition the absolute value was taken before calculating the mean.
Attached (if it works) is a PDF file with 3 slides of histograms. The first shows t-values across all vertices for the comparison of condition 1 (high-frequency words) with condition 2 (low-frequency words). The values are centered around -1 in MNE (the actual t-test was calculated with a custom script I wrote in Python, also attached as a text file) and around 1 in BST. This is a striking difference that is also easily seen in the cortical maps.
To try to track this down I took the first 3 subjects only (slide 2). Here the difference is less pronounced, but is definitely beginning to emerge. I then looked at wMNE values from a single subject (slide 3), and it turns out that these values also differ quite a bit between the two software packages.
I really hope one or both of you can give me some guidance as to where to begin tracking down the cause of these differences. No doubt you’ll need more info than I’ve provided here, so please just let me know.
Thanks in advance, and I hope you have a very happy new year!
Indeed, we’re going to need more details. Did you use loose or fixed orientation? If loose, how much?
If you used loose, you most probably did the t-test after pooling the 3 orientations (i.e., on the magnitude of the dipole).
Is BST doing the same?
Did you morph the same way with both approaches?
Alex
PS: Rather than using Rpy to do a paired t-test, you can simply use scipy.stats.ttest_rel; calling scipy.stats.ttest_rel(amat, bmat) would be much faster than the for loop.
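For reference, a minimal sketch of that vectorized approach, combined with the orientation pooling mentioned above. All array names, shapes, and the stand-in random data are hypothetical:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_vertices = 20, 1000

# Stand-ins for the real per-subject source estimates: one x/y/z
# orientation triplet per vertex (the free-orientation case).
amat = rng.standard_normal((n_subjects, n_vertices, 3))
bmat = rng.standard_normal((n_subjects, n_vertices, 3))

# Pool the 3 orientations into a dipole magnitude per vertex.
amag = np.linalg.norm(amat, axis=2)
bmag = np.linalg.norm(bmat, axis=2)

# One paired t-test per vertex, in a single vectorized call
# instead of an Rpy call inside a for loop.
t_vals, p_vals = stats.ttest_rel(amag, bmag, axis=0)
```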
We touched on this a little bit with a previous question I posted to the MNE list. Basically, I ran mne_do_inverse_operator using depth weighting. I specified neither the “–fixed” nor the “–loose” options. The manual says, “By default, the source orientations are not constrained.” What you previously concluded about that was the following: “I think you have used ‘free orientation’ which is equivalent to loose = 1.0. (loose=0 means fixed orientation).”
I can’t recall whether or not BST uses loose or fixed orientation by default. I used whatever the default is (including depth weighting). I’ll check the default setting the next time I’m at my lab computer, unless Francois can tell us before then.
As for morphing from individual to group space, all the histograms I sent are based on values in group space. As I understand it, that process is very different between BST and MNE, with MNE using surface-based warping and BST using a simple piecewise linear transform. Please correct me if I have those details wrong.
As for my t-test script, I originally used the scipy.stats.ttest_rel function, but from the documentation I couldn’t tell whether it was assuming equal or unequal variance, so I rewrote my script to use the R t-test function, which has a parameter for setting whether the variance is equal or unequal. It turns out it gives exactly the same answer as the scipy.stats.ttest_rel function, but I just haven’t bothered to change it back.
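A likely reason the two agree exactly, sketched below: a paired t-test is just a one-sample t-test on the per-subject differences, so an equal/unequal-variance setting never enters into it (that option only applies to unpaired two-sample tests). The toy data here are stand-ins:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.standard_normal((20, 5))
b = rng.standard_normal((20, 5))

# A paired t-test and a one-sample t-test on the differences are the
# same computation, so they match to machine precision.
t_paired, _ = stats.ttest_rel(a, b, axis=0)
t_diff, _ = stats.ttest_1samp(a - b, popmean=0.0, axis=0)
print(np.allclose(t_paired, t_diff))  # True
```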
On the Brainstorm front, the MNE estimate is performed with fixed source orientations. From what you are saying about the default options of MNE, the orientations are left unconstrained, hence the two models differ. I would recommend you either switch BST to loose orientation (to be selected after you click on the ‘Expert mode’ button in the ‘Compute source’ GUI) or switch MNE to constrained source orientation. I would vote for the latter, as handling unconstrained orientations can be a bit tricky (we can check that together later).
I agree that trying the fixed orientations approach in MNE is a better option for now, particularly since I have the pipeline scripted in MNE (I haven’t managed to finish doing that yet in Brainstorm).
I’ll start that soon and let you know how it goes.
OK, I ran “mne_do_inverse_operator” and explicitly specified the “–fixed” option. The only thing I can figure is that this must be the actual default (as opposed to “loose = 1.0”) because when I contrast the high frequency word condition with the low frequency one for 20 subjects, I get what looks like exactly the same distribution of t-values as before. The same goes for the actual MNE values for one subject in each condition. The corresponding slides are attached as a pdf.
So I still don’t know how to account for these divergent results. Other salient differences between the MNE and BST analyses that occur to me are that in MNE I used what I think was a single-layer boundary-element model for the forward model, while in BST I used an overlapping spheres model. I’d be surprised if that accounted for the differences I’m seeing, but I really don’t know. Also, in BST I used “absolute values” when projecting sources to group space. I don’t know what MNE does by default.
Anyway, I would appreciate any suggestions as to what to try next.
When you use --fixed with MNE, you get signed values, so you should not use “absolute value” in BST if you want to compare.
Otherwise, just add an np.abs in the MNE Python script.
Let me know if it solves the problem.
I too had this concern about making sure to use absolute values for t-tests in both MNE and BST, so I took the absolute value at each vertex, as shown on lines 31 and 32 of the attached script. This was already reflected in the results I posted at the beginning of this thread.
In my previous post I was referring to the other point at which BST makes the user choose whether or not to use absolute values, which is at the stage of projecting sources from individual to atlas space. This is not a choice that MNE seems to require the user to make, and I don’t know what the default is, or whether it would make a difference. I guess I’ll just try generating group results in BST using signed rather than absolute values for the projection and see what happens. In the meantime, thanks for your help so far, and I’d appreciate any other suggestions.
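As an aside, a toy sketch of why the signed-versus-absolute choice can matter: rectifying signed values biases every mean upward, so the same data can produce quite different condition contrasts depending on where the absolute value is applied. The random data here are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal((20, 1000))  # signed source values (stand-ins)

print(x.mean())          # ~0: signed values average out
print(np.abs(x).mean())  # ~0.80: E|N(0,1)| = sqrt(2/pi), a positive bias
```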
That makes sense, and it inspired me to compare the weighted MNE values at the level of the individual subject space before applying any morphing or smoothing. Again, the same input files were used in MNE and BST, where I used “mne_process_raw” to calculate means across trials for each condition. As far as I can tell, the only things that were done differently between the packages are: 1) the forward models are different, and 2) the covariance matrix was calculated within each package rather than using the same one for both (just because I didn’t bother to figure out how to import the one calculated in MNE into BST).
The results of looking at histograms of the weighted MNE values before morphing or smoothing are in 2 attached PNG files (the only difference between them is that they each represent a different experimental condition). One thing to notice right away is that the values from MNE (upper panel) peak around 2 (*10^-11) and are all positive, while the values from BST (lower panel) peak around 0 (non-zero values are also of magnitude 10^-11) and extend in both positive and negative directions. This made me think that the baseline is being treated differently in BST compared to MNE, but the only way I can find to baseline correct in BST is by using z-scores, and the results of that would not be directly comparable to the results from MNE.
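In case it helps anyone reproduce this kind of check, a minimal sketch of the histogram comparison. Variable names and the stand-in data are hypothetical; in practice the arrays would be the exported source values from each package:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
mne_vals = np.abs(rng.standard_normal(10000)) * 2e-11  # all-positive stand-in
bst_vals = rng.standard_normal(10000) * 1e-11          # signed stand-in

# Shared bins make the sign and scale differences easy to see.
bins = np.linspace(min(mne_vals.min(), bst_vals.min()),
                   max(mne_vals.max(), bst_vals.max()), 100)
plt.hist(mne_vals, bins=bins, alpha=0.5, label='MNE')
plt.hist(bst_vals, bins=bins, alpha=0.5, label='BST')
plt.legend()
plt.show()
```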
I feel like these results should be closer to each other than I’m seeing. Do others agree? Any ideas on what I’m doing wrong and/or what to try next?
Regarding the noise covariance, did you compute it the same way? On single trials, or after averaging?
Looking at the histograms, I observe:
The MNE values are all positive, which means an “abs” has been applied (look at the --signed option of mne_make_movie, unless you used Python to apply the inverse operator).
The BST values are centered around 0, which means a z-score might have been applied.
To sum up, the comparison would be fair with no z-score in BST and signed outputs with MNE.
One last thing: make sure the depth weighting parameter is the same.
Will:
If you want to remove some degrees of freedom in your comparison, in Brainstorm you can:
1) Use the noise covariance you used in MNE: right-click on the folder in which your noise covariance matrix is > Noise covariance > Import from file.
2) Use an OpenMEEG single-shell BEM model.
I would say it’s normal to have the distribution of source values centered around zero at a single time point; it doesn’t mean a z-score has been applied to the values.
OK, when I run “mne_make_movie” with the “–signed” option, things look a lot more similar across the two platforms. One remaining difference is that the values are scaled slightly differently across the two, with values between -3 and 4 *10^-11 for MNE, and between -0.8 and 1.0 *10^-11 for BST. Francois, what I’ve been attaching to earlier posts are histograms for values across all time points and all vertices.
Anyway, I’m trying to use OpenMEEG to generate a single-shell BEM model, as you helpfully suggested, and I’m running into a problem. I can select it and start downloading. It seems to finish downloading, but when it starts the installation, it quits with an error. Here’s the full output from the Matlab terminal:
The scaling difference can typically be due to a different lambda, i.e., a different SNR. To get lower values with MNE, decrease the SNR value with mne_make_movie.
For OpenMEEG, what OS are you using (Linux, Mac, Windows)?
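For reference on the lambda point: in the MNE convention the regularization parameter is tied to the assumed SNR as lambda^2 = 1/SNR^2, so lowering the SNR increases regularization and shrinks the source amplitudes. A trivial sketch:

```python
# lambda^2 = 1 / SNR^2 (the MNE convention for the regularization weight).
snr = 3.0                 # e.g. the default of 3
lambda2 = 1.0 / snr ** 2  # = 1/9; a smaller SNR gives a larger lambda2
```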
Thanks for the tip about the SNR. It looks like both MNE and BST use a default of 3, so I’d be surprised if that accounted for the difference, but I’ll keep it in mind.
As for my OS, I’m running Linux (specifically, Ubuntu 10.04 LTS).
Hi Will,
The difference, if it’s just a scaling, might be due to a possible difference in how the source covariance matrix is adjusted in MNE vs BST when the data have been processed by SSS. The rank of the data after SSS is lower (usually 64 if using regularization and the default orders of 8 and 3 for the spherical harmonics expansion). If I remember correctly, in BST the source covariance is adjusted so that the trace of L_w C_J L_w' is equal to the rank of the data (and noise covariance), and maybe MNE is setting it up so that the trace is equal to the number of channels. This is just one possible difference.
Rey
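A sketch of the normalization Rey is describing (all names and sizes are hypothetical stand-ins): the source covariance C_J is rescaled so that the trace of L_w C_J L_w' hits a chosen target, and the two candidate targets differ by a factor of rank / n_channels:

```python
import numpy as np

rng = np.random.default_rng(4)
n_channels, n_sources, rank = 306, 500, 64

L_w = rng.standard_normal((n_channels, n_sources))  # whitened gain (stand-in)
C_J = np.eye(n_sources)                             # initial source covariance

# One convention: trace(L_w C_J L_w') == rank; the other: == n_channels.
target = rank
C_J *= target / np.trace(L_w @ C_J @ L_w.T)

# The alternative target would scale C_J by n_channels / rank relative to
# this one, which shows up as a constant factor in the source amplitudes.
```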
Thanks, Alex. I tried copying the files from the location you provided to the directory where Brainstorm expects the OpenMEEG files to be (in my case, /home/wgraves/.brainstorm/openmeeg/linux64). Since it looked at first like only the bin and lib files were there from before, that’s all I copied. That didn’t work, so I tried copying the entire directory structure from the uncompressed tarball to that location, and that didn’t work either. Here’s the error I get:
** Error: OpenMEEG error #127:
**
Oddly, it looks like the same error as before. If I’m doing something wrong, I have no idea what it is. I’d really appreciate any help you could give.
./om_assemble: error while loading shared libraries: libOpenMEEG.so.1: cannot open shared object file: No such file or directory
Which is odd, because I do see that file in the current directory. However, I just noticed that one level down, in the “lib” directory, libOpenMEEG.so.1 is a soft link to libOpenMEEG.so.1.1.0, but that’s not the case in the top-level “linux64” directory. Anyway, I just made all the soft links in the linux64 directory look like the ones in the lib directory, and I still get the same error.
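One possible explanation, offered as a guess rather than a confirmed fix: on Linux the dynamic loader does not search the current directory for shared libraries, so having libOpenMEEG.so.1 next to om_assemble is not enough; the directory has to be on LD_LIBRARY_PATH (or in a standard library path). A sketch of how that could be tested, using the path mentioned above:

```python
import os
import subprocess

omdir = os.path.expanduser('~/.brainstorm/openmeeg/linux64')

# Put the OpenMEEG directory on the loader's search path for this call only.
env = dict(os.environ)
env['LD_LIBRARY_PATH'] = omdir + ':' + env.get('LD_LIBRARY_PATH', '')

# If the shared-library error disappears here, the search path was the issue.
subprocess.call([os.path.join(omdir, 'om_assemble')], env=env)
```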