Importing Data for Multiple Subjects Simultaneously

Hi Francois,

Here’s the Matlab .m script, and I’ll describe below what I’m trying to do/what’s wrong:

% Input files
sFiles = {...
    '75/@raw75_MS_05_12_15_EEG_REST_bandpass_notch/data_0raw_75_MS_05_12_15_EEG_REST_bandpass_notch.mat', ...
    '72/@raw72_NM_04_29_15_EEG_REST_bandpass_notch/data_0raw_72_NM_04_29_15_EEG_REST_bandpass_notch.mat'};
SubjectNames = {...
    '75', '72'};

% Start a new report
bst_report('Start', sFiles);

% Process: Import MEG/EEG: Time
sFiles = bst_process('CallProcess', 'process_import_data_time', ...
    sFiles, [], ...
    'subjectname', SubjectNames{1}, ...
    'condition',   '', ...
    'timewindow',  [], ...
    'split',       0, ...
    'usectfcomp',  1, ...
    'usessp',      1, ...
    'freq',        [], ...
    'baseline',    []);

So I'm trying to import resting EEG data for participants 75 and 72, and then use the "SubjectNames" input to assign each newly imported file to its respective participant. However, when I run the script, the imported files for both participants 75 and 72 end up under the participant 75 node in my Brainstorm database.

Under % Process: Import MEG/EEG: Time, I've also tried changing 'subjectname', SubjectNames{1}, ... to 'subjectname', SubjectNames{1,2}, ... but then it assigns my imported files to participant 72's node (the second option).

I could ultimately set up a for-loop to get around this, but I wanted to pose the question here in case A) I’m making a mistake or B) there’s a small bug in the program.

Thanks in advance for any help!
Anthony

Hi Anthony,
You will need to put this in a loop. The inputs to this process are the file you want to import from and the subject where the imported data should go, so you need to index both sFiles and SubjectNames on each iteration.

for iFile = 1:length(sFiles)
    % Process: Import MEG/EEG: Time
    sFilesOut{iFile} = bst_process('CallProcess', 'process_import_data_time', ...
        sFiles{iFile}, [], ...
        'subjectname', SubjectNames{iFile}, ...
        'condition',   '', ...
        'timewindow',  [], ...
        'split',       0, ...
        'usectfcomp',  1, ...
        'usessp',      1, ...
        'freq',        [], ...
        'baseline',    []);
end
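
Note that I've stored the outputs in a new variable (sFilesOut) rather than overwriting sFiles; otherwise the second iteration would no longer find sFiles{iFile}. If you also want to save and view the report at the end, something along these lines should work after the loop (this just follows the pattern of Brainstorm's auto-generated scripts; adjust to your setup):

% Save and display the report once all files have been imported
% (sFilesOut is the variable name used in the loop above)
ReportFile = bst_report('Save', [sFilesOut{:}]);
bst_report('Open', ReportFile);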

Beth

Okay thanks Beth!