Exporting Source Activations to Matlab

Hi Brainstorm Team,

I have a quick question concerning the export of source activations to Matlab.

I used unconstrained sLORETA to reconstruct the cortical activation on a cortex with 15002 vertices. I then exported the kernel and the data to Matlab to compute the source activations. After the export, I get a kernel with 45006 x nSensors elements, i.e. 3*nVert rows. To analyze this further, I would like to know how these rows are organized: are the three dipole orientations of each vertex adjacent (i.e. Vert1_1, Vert1_2, Vert1_3, Vert2_1, …), or are all vertices of one orientation adjacent (i.e. Vert1_1, Vert2_1, Vert3_1, …, Vert1_2, Vert2_2, …)?

Is there by any chance a way to use the functions implemented for the scout activations (mean, PCA, power, etc.) on every vertex from within Matlab, to get an overall activation map?

Thanks a lot for your help, I look forward to hearing from you
Tim

Hi Tim,

The rows of the imaging kernel are organized vertex by vertex, with three sources at each vertex: [V1x, V1y, V1z, V2x, V2y, V2z, …]
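
For example, here is a minimal sketch of how this can be used in a script (the variable names ImagingKernel and F are just placeholders for the exported kernel and recordings):

% ImagingKernel: exported [3*nVert x nSensors] kernel, F: [nSensors x nTime] recordings
sources = ImagingKernel * F;                  % [3*nVert x nTime] source time series
% Rows are grouped per vertex: [V1x; V1y; V1z; V2x; V2y; V2z; ...]
nVert = size(sources, 1) / 3;
nTime = size(sources, 2);
sources = reshape(sources, 3, nVert, nTime);  % [3 x nVert x nTime]
v10 = squeeze(sources(:, 10, :));             % x/y/z time series of vertex 10, [3 x nTime]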

The function that computes the scout functions is bst_scout_value, but it is probably quite complicated to call manually from a script.
What result are you trying to calculate or display?

Francois

Hi Francois,

First of all, thanks for the fast response and help.

I implemented a cluster-based correction for multiple comparisons (similar to the permutation test implemented in FreeSurfer) and wanted to run it on the statistics extracted from an unconstrained source reconstruction.

In order to deal with the three dipole estimates at every position, I decided to start with a simple vector norm as an approximation, but I was also interested in the other possibilities (power, PCA, SVD, etc.). Your pointer to the scout function helps with this: I can check the code and see how the different methods are implemented.

If you have suggestions or comments, they are always highly welcome. Of course, I am happy to share my code with you if you are interested.
Tim

PS: There was a related post recently in which it was suggested to estimate the null distribution by randomly sampling p-values from a uniform distribution and smoothing them appropriately to match the smoothness of the original data. However, I could not find out whether Brainstorm provides a function to estimate the smoothness of the original data.

If you are already working outside of the interface, you can call the function bst_scout_value from your script.
The parameter you are interested in is “XyzFunction”, which defines how the values for the 3 orientations are combined into one value per vertex. Only three options are available: ‘norm’, ‘pca’, ‘none’.
Norm takes sqrt(x^2+y^2+z^2) at each time point; PCA takes the first mode of an SVD decomposition.
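
As a rough illustration only (not the actual Brainstorm code, and ignoring Brainstorm's sign conventions), the two options correspond to something like this for a [3 x nVert x nTime] source array:

% sources: [3 x nVert x nTime] unconstrained source array
[~, nVert, nTime] = size(sources);

% 'norm': vector norm of the three orientations at each vertex and time point
srcNorm = squeeze(sqrt(sum(sources.^2, 1)));   % [nVert x nTime]

% 'pca': keep the first mode of an SVD of the [3 x nTime] block of each vertex
srcPca = zeros(nVert, nTime);
for iVert = 1:nVert
    Fv = squeeze(sources(:, iVert, :));        % [3 x nTime]
    [U, S, V] = svd(Fv, 'econ');
    srcPca(iVert, :) = S(1,1) * V(:,1)';       % first temporal mode, scaled
end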

Do you mean spatial or temporal smoothing? It can be interesting to smooth the individual source maps a bit on the cortex surface before doing any group analysis, so that there is a higher chance of overlap between subjects (process Sources > Spatial smoothing).
There is no function to estimate the smoothness of the data.

Hi Francois,

The smoothing was meant in the spatial domain. I do smooth the data after projecting to the default anatomy (Colin27), to correct for small errors in the spatial alignment.

The overall goal is to correct for multiple comparisons at the cluster level (as an alternative to FDR). For instance, one can run a permutation test (this is what I implemented) or, alternatively, a Monte Carlo simulation in which p-values are sampled from a uniform distribution for every vertex (this was suggested here). In the second case, however, the smoothness of the original data and of the random samples is vastly different, so the resulting clusters cannot be compared. Thus, the surface with randomly sampled p-values needs to be smoothed to match the smoothness of the original data, which means the smoothness of the original data has to be estimated first. In FreeSurfer, this is accomplished with an AR(1) model.
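
For context, here is a rough sketch of the cluster step in my permutation test (my own code, assuming a per-vertex statistic stat, a cluster-forming threshold, and the VertConn adjacency matrix from the surface file):

% stat: [nVert x 1] test statistic (e.g. vector norm of t-values)
% VertConn: [nVert x nVert] sparse vertex adjacency matrix of the cortex surface
supra = find(stat > threshold);            % supra-threshold vertices
G = graph(VertConn(supra, supra));         % subgraph restricted to those vertices
labels = conncomp(G);                      % connected component of each vertex
clusterSizes = accumarray(labels(:), 1);   % number of vertices in each cluster
maxClusterSize = max([clusterSizes; 0]);   % null statistic for this permutation/sample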

From the other post on cluster-based corrections in the forum (the one I linked above), I got the impression that an estimation of the smoothness was already implemented in Brainstorm:

% generate the smoothing kernel
[surf.W,surf.Y,surf.Z] = tess_smooth_sources(surf.Vertices, surf.VertConn);

However, this function seems to have changed (the number of return parameters is different), and apart from this, I was unsure whether it performed a smoothness estimate in the first place. Am I right that it does not? As far as I can see, this function performs the smoothing operation, but does not estimate the smoothness of an existing surface map…

Cheers
Tim

Hi Tim,

Indeed, the syntax of tess_smooth_sources has changed; it is documented in the header of the function itself:
W = tess_smooth_sources(Vertices, Faces, VertConn=[], FWHM=0.010, Method='average')

This function returns a smoothing kernel W, which is a linear operator [Nsources x Nsources] that you apply to your source matrix to smooth it over the surface. It does not estimate anything.
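
For example (a minimal sketch; the surface file name and the sourceMap variable are placeholders for your own data):

% Load the cortex surface used for the source reconstruction (adjust the file name)
cortex = load('tess_cortex_pial_low.mat');

% Build the smoothing operator with the current syntax (FWHM = 0.010 as in the default)
W = tess_smooth_sources(cortex.Vertices, cortex.Faces, cortex.VertConn, 0.010, 'average');

% Apply it to a source map defined on the vertices: [nVert x nTime] -> [nVert x nTime]
% (for an unconstrained model, collapse the 3 orientations first, e.g. with the norm)
smoothedMap = W * sourceMap;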

Cheers,
Francois

Hi Tim and Francois,

I have been following this conversation, but I'm a little lost (I am fairly new to this research). I am also trying to export the source activity to MATLAB; by source activity I mean the scout activity (i.e. the time series of the cuneus from the Desikan-Killiany atlas). I have tried this manually: I compute the sources using minimum norm imaging with automatic shrinkage, using sLORETA. However, after obtaining the sources and plotting the scout activity, I notice that the number of points in the scout is smaller than the number of samples. My EEG data is [64x21570 double], but after exporting all scouts to MATLAB, one scout has the size

size(scouts_walk2.F{1})

ans =

       1       10785

What is the reason for this?

Hello
Sorry for the response delay.
It looks like you have exported time series for 1 scout, averaged over all the vertices within this scout.
I’m not sure why you have 10785 time points. Have you selected only half of the time definition, or have you downsampled it?
What is your question exactly?
Francois

Hey, no problem, I actually figured it out: under the Preferences options, I had “Downsample to view data faster” checked, so only half the data was showing. I got the data working though, and it all adds up properly.

Avoid using the menu that saves data from a figure.
Prefer processes like “Extract > Scouts time series” or “Extract > Extract values”.

How would I be able to accomplish this script-wise? I have a big study (>1000 trials), and I am trying to export all the source data.

  1. You can either select your 1000 files in the Process1 box and use the same process “Extract > Extract scouts time series”, with the option “Concatenate” NOT selected
  2. Or leave your Process1 list empty and use the selection processes (File > Select files: Sources) to grab files by subject/folder/tag. Generate the corresponding Matlab script and reuse that call in your own scripts (see the sketch below).
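
The generated call looks roughly like the sketch below; the option names here are assumptions that may differ between versions, so copy the exact ones from your own generated script:

% Sketch only: regenerate the script from the pipeline editor to get the exact options
sFiles = bst_process('CallProcess', 'process_extract_scout', sFiles, [], ...
    'timewindow',     [], ...                  % all time points
    'scouts',         {'Desikan-Killiany', {'cuneus L', 'cuneus R'}}, ...
    'scoutfunc',      1, ...                   % 1 = mean
    'isflip',         1, ...                   % flip signs before averaging
    'isnorm',         0, ...
    'concatenate',    0, ...                   % one output file per input file
    'save',           1, ...
    'addrowcomment',  1, ...
    'addfilecomment', 1);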

Does it answer your question?

Yes, the first method worked like a charm, thank you kind sir!

  1. You can either select your 1000 files in the Process1 box and use the same process “Extract > Extract scouts time series”, with the option “Concatenate” NOT selected

I would like to ask how this is to be done. It seems that either concatenating signals or concatenating time must be chosen.

It looks like you are not referring to the same process:

[screenshot]


Thank you very much for your answer, this really solves my current problem.
I will try this method and idea.