Hi,
My situation is the following: I need to import a large number of small epochs (several thousand per run), and it takes weeks. I tried to split the work across multiple MATLAB instances (each running a separate subject), but some of the epochs are skipped, i.e. once the process has run I do not have the same number of epochs as I have events. I counted the files inside the corresponding folder within /data and the number of .mat files is not right.
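For reference, this is roughly the kind of check I do after an import; the protocol path, the file-name pattern for imported trials, and the expected count are just placeholders for my setup:

```matlab
% Rough sanity check: compare the number of imported epoch files in one
% condition folder with the number of events I expected to import.
% Paths, file-name pattern and the expected count are placeholders.
protocolData = fullfile('~', 'brainstorm_db', 'MyProtocol', 'data');
condDir      = fullfile(protocolData, 'Subject01', 'Run01');
epochFiles   = dir(fullfile(condDir, 'data_*_trial*.mat'));  % imported epochs
nExpected    = 1234;                                         % events in this run
fprintf('Found %d epoch files, expected %d\n', numel(epochFiles), nExpected);
if numel(epochFiles) ~= nExpected
    warning('Missing epochs in %s', condDir);
end
```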
I suspect this may be caused by the protocol.mat file being accessed by several instances simultaneously. I have now turned off the db_save function as suggested in this Brainstorm discussion (Processes slowing way down...?), but I have not yet tried to rerun the import and am not sure whether this is a good way of doing it.
Because some runs do not contain instances of all the events, the process crashes if I set it up to import a batch of runs at once. I therefore have to launch each run manually, specifying only the subset of events that it actually contains, which makes it very hard to leave the whole thing to run overnight or sequentially on its own.
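To make it concrete, this is roughly what I end up doing per run in a script: read the events actually present in the raw file and only ask the import process for those. The event names, subject name and epoch window below are placeholders, and the option names are what Brainstorm's pipeline editor generates for me, so please treat this as a sketch rather than exact code:

```matlab
% Sketch: import only the events that this particular run contains.
% sRawFile is the "Link to raw file" entry for the run (placeholder).
wantedEvents = {'standard', 'deviant'};          % placeholder event names
rawMat   = in_bst_data(sRawFile.FileName, 'F');  % load the sFile structure
present  = {rawMat.F.events.label};              % events found in this run
toImport = intersect(wantedEvents, present);     % keep only those that exist
if ~isempty(toImport)
    bst_process('CallProcess', 'process_import_data_event', sRawFile, [], ...
        'subjectname', 'Subject01', ...
        'condition',   '', ...
        'eventname',   strjoin(toImport, ', '), ...
        'epochtime',   [-0.2, 0.8], ...
        'createcond',  1);
end
```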
Question: What procedure would you recommend for importing large numbers of epochs into different runs that contain different event types, without corrupting the protocol.mat file or missing epochs, given that saving/loading a protocol file that contains many epochs gets very slow?
Thanks in advance for your understanding.
Hugo