Running out of memory when averaging source data

Hello

I've run an EEG experiment in which I elicited different emotions during epochs of up to 60 seconds, and I'm comparing the different emotion states to a neutral state.

I'd like to run a source localisation: both on the frequency bands elicited by the different emotions, compared to the neutral state, and a general source localisation that isn't restricted to specific frequency bands, in an attempt to localise the activation of specific brain regions during specific emotions, e.g. the amygdala during fear or the insula during disgust.

I've imported the epochs, computed a mixed BEM head model that includes the subcortical regions (amygdala, hippocampus and caudate), and computed the sources. I'm using the default anatomy for all of my subjects.

My plan for the source analysis, for each participant, is to average the source files across the trials of each emotion (7 emotions and 1 neutral state, with 3-4 trials per emotion), and then average these across time over the duration of the emotion epoch (some epochs were shorter than 60 seconds if the participant terminated them early). I can then run some statistics comparing the time-collapsed emotion sources to the neutral state.
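
In rough MATLAB terms, assuming generic arrays rather than the actual Brainstorm structures (the variable names below are just placeholders), this is what I have in mind:

```matlab
% Sketch of the planned averaging (placeholder variables, not actual Brainstorm structures)
% sourceTrials: cell array of [Nsources x Ntime] source matrices, one per trial of one emotion
nTrials    = numel(sourceTrials);
avgSources = zeros(size(sourceTrials{1}));
for iTrial = 1:nTrials
    avgSources = avgSources + sourceTrials{iTrial} ./ nTrials;   % average across trials
end
avgOverTime = mean(avgSources, 2);   % then collapse across time -> [Nsources x 1]
```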

I've tried dSPM, sLORETA and minimum norm source localisation, and each time, when I try to average the source files, I get the following error message, saying that I'm out of memory:

This is when I am only trying to average a few files at once, as you can see from the image.

I get the same error when I try to average one or more source files across time as well.

Similarly, I get the same error when I try to perform a baseline normalisation on the minimum norm sources, with the baseline set to the neutral state (using the Process2 tab).

I have also tried averaging across time the source data obtained from averaged EEG trials, but I got the same error again.

I'm running Brainstorm on the MATLAB Runtime R2015b, on a fast, almost new Dell laptop running Windows 7 Professional. The version of Brainstorm that I'm using was updated on 29th November this year.

Please could you give me some insight as to why it is running out of memory, and how I might get around this problem? I'm not sure if it's a bug or a problem with my approach.

Many thanks

Luli

Hello again,

I don’t think my problem is a bug, since I am able to average source files across time with the data from the Deep Brain Activity tutorial (http://neuroimage.usc.edu/brainstorm/Tutorials/DeepAtlas).

So it's something about my data that's stopping the averaging of source files, either across trials or across time, by requiring too much memory (my 8GB of RAM maxes out with my data, but it is fine with the tutorial data). This happened even when I was trying to average only three source trials, or one file across time over a 10-second period. Do you know what this might be?

I am able to average the EEG data across files and time, and then calculate the sources from this file. Is this a valid approach?

Many thanks again

Luli

Hello,

First, I don't think you'll be able to get anything interesting by averaging 60-second epochs together: after a few seconds, the signals are no longer locked to your stimuli, so there is very little chance that averaging across trials would combine brain signals that are aligned in time.
Averaging over time is not a good solution either. This is detailed in other forum posts, like this one: "Unusual values for statistics on Scouts: Constrained vs Unconstrained".

The reason you get this error is independent of these questions: you don't have enough memory available to reconstruct the full source files in memory. The screen capture you posted shows files saved in the optimized way (imaging kernel + recordings: http://neuroimage.usc.edu/brainstorm/Tutorials/SourceEstimation#Computing_sources_for_single_trials). The sources are kept in this format for interactive display, but any non-linear operation applied to the file first rebuilds the full source matrix (Nsources x Ntime). That step crashes because you don't have enough memory.
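
As a rough order-of-magnitude check, here is the kind of memory this implies; the numbers are only an example, not your actual dimensions:

```matlab
% Rough estimate of the memory needed to rebuild ONE full source matrix
% (example numbers only; check the actual dimensions in your own files)
nSources = 15000;               % e.g. cortex + subcortical mixed source model
sfreq    = 500;                 % sampling rate (Hz)
duration = 60;                  % epoch length (s)
nTime    = sfreq * duration;
memGB    = nSources * nTime * 8 / 1024^3   % 8 bytes per double-precision value
% => about 3.4 GB for a single reconstructed trial, before any averaging
```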

To check the size of the largest variable your computer can handle, go to the Brainstorm preferences (File > Edit preferences) and look at the bottom of the window.
Try shorter time windows or fewer vertices in your source maps. To check the dimensions of the variables saved in a file: right-click on the file > File > View file contents.
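
If you have access to a full MATLAB installation (not only the Runtime), you can also check this from the command line; the file name below is only an example:

```matlab
% List the variables saved in a Brainstorm result file and their dimensions
% (example file name; use one of your own results_*.mat files)
srcFile = 'results_dSPM_EEG_KERNEL_161129_1200.mat';
whos('-file', srcFile)
% Kernel-based files should show an ImagingKernel variable [Nsources x Nchannels],
% while full files contain ImageGridAmp [Nsources x Ntime].
```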

Francois

Hi Francois

Thanks very much for your reply; it was really helpful.

Luli