Hello
I've run an EEG experiment where I elicited different emotions for epochs of up to 60 seconds. I'm comparing different emotion states to a neutral state.
I'd like to run a source localisation: both on the frequency bands elicited in the different emotions compared to the neutral state, and a general source localisation that isn't restricted to specific frequency bands, in an attempt to localise activation of specific brain regions during specific emotions, e.g. the amygdala during fear or the insula during disgust.
I've imported the epochs, computed a mixed BEM head model including the subcortical regions (amygdala, hippocampus and caudate), and computed sources. I'm using the default anatomy for all of my subjects.
My plan for the source localisation analysis for each participant is to average the source files of the trials for each emotion (7 emotions and 1 neutral state, with 3-4 trials per emotion), and then average these across time over the duration of the emotion epoch (sometimes less than 60 seconds, if terminated early by the participant). Then I can run some stats on the averaged emotion sources compared to neutral, collapsed across time.
I've tried dSPM, sLORETA and minimum norm source localisation, and each time, when I try to average the source files, I get the following error message saying that I'm out of memory:
This is when I am only trying to average a few files at once, as you can see from the image.
I get the same error when I try to average one or more source files across time as well.
Similarly, I got the same error when I tried to perform a baseline normalisation on the minimum norm sources, with the baseline set to the neutral state (using the Process2 tab).
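For clarity, this is my understanding of what the per-source baseline normalisation (z-score) computes; the array shapes and variable names below are just my own illustration, not Brainstorm's actual code:

```python
import numpy as np

# Sketch of per-source z-score baseline normalisation:
# each source's time series is centred and scaled by the
# mean and standard deviation of its baseline segment.
rng = np.random.default_rng(0)
sources = rng.standard_normal((100, 3000))   # vertices x time samples (emotion epoch)
baseline = rng.standard_normal((100, 1000))  # vertices x time samples (neutral state)

mu = baseline.mean(axis=1, keepdims=True)    # per-vertex baseline mean
sigma = baseline.std(axis=1, keepdims=True)  # per-vertex baseline std
z = (sources - mu) / sigma                   # normalised sources, same shape
```

If that is roughly what the process does, it would need both the emotion and neutral source matrices in memory at once, which may be relevant to the error.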
I have also tried to average the source data obtained from averaged EEG trials across time, but I got the same error again.
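I wonder whether sheer file size explains this. A back-of-the-envelope estimate of one full-resolution source epoch (assuming a typical ~15000-vertex mixed source space, a 500 Hz sampling rate and double precision; I haven't confirmed these numbers for my own data):

```python
# Rough memory estimate for one full (non-kernel) source epoch.
# All parameter values are assumptions, not measured from my data.
n_vertices = 15000       # assumed size of mixed cortex + subcortical source space
sfreq_hz = 500           # assumed sampling rate
epoch_s = 60             # maximum epoch duration from the experiment
bytes_per_value = 8      # double precision

bytes_per_epoch = n_vertices * sfreq_hz * epoch_s * bytes_per_value
gib_per_epoch = bytes_per_epoch / 2**30
print(f"{gib_per_epoch:.1f} GiB per epoch")  # about 3.4 GiB

# Holding even 3-4 such files at once while averaging:
n_trials = 4
print(f"{n_trials * gib_per_epoch:.1f} GiB for {n_trials} epochs")
```

If Brainstorm is saving full source matrices rather than imaging kernels, even a handful of 60-second epochs could plausibly exhaust the RAM on a laptop.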
I'm running Brainstorm on MATLAB Runtime R2015b, on a fast, almost-new Dell laptop running Windows 7 Professional. The version of Brainstorm I'm using was updated on 29th November this year.
Please could you give me some insight as to why it is running out of memory, and how I might get around this problem? I'm not sure if it's a bug or a problem with my approach.
Many thanks
Luli