Envelope Correlation N x N (2020) - Dimensions of arrays being concatenated are not consistent

Thank you for the detailed explanation.

I ran the Envelope Correlation (2020) process on a concatenated time window of 379.92 seconds for 68 scouts, with the following options: 'Envelope correlation (orthogonalized)', time resolution set to 'Dynamic', an estimation window length of 20000 ms with 50% overlap, the Parallel Processing Toolbox unchecked, and 'Average among output files' as the output option.

Following this discussion, I edited bst_connectivity.m to take the absolute value on both of the lines mentioned (lines 525 and 536 in my copy) to see whether I could reproduce the results of the HENV process with the old AEC one. Unfortunately, I obtained two different results (figures attached): the upper figure shows the HENV process and the lower one the old AEC process with the absolute value added in the code. Could this be because the old AEC process averages the measure across the time window? In the HENV results, I kept the 'Dynamic' time resolution, as suggested in the tutorials and, as I understand it, in this discussion.

For both processes, I applied the orthogonalization and studied the following frequency bands:
delta / 1, 4 / mean
theta / 4, 8 / mean
alpha / 8, 12 / mean
beta / 12, 30 / mean
gamma / 30, 70 / mean
broadband / 1, 70 / mean
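For readers trying to reproduce the comparison outside Brainstorm, a minimal sketch of the orthogonalized AEC with a sliding ("dynamic") window is shown below. This is an illustration in Python/NumPy, not the actual bst_connectivity.m code; the function names and the symmetrized absolute-value step are assumptions based on the pairwise orthogonalization scheme (Hipp et al., 2012) the process is modeled on.

```python
import numpy as np
from scipy.signal import hilbert

def aec_orth(x, y):
    """Orthogonalized amplitude envelope correlation between two
    real-valued signals (sketch of the Hipp et al. 2012 approach)."""
    ax, ay = hilbert(x), hilbert(y)
    # Orthogonalize each analytic signal with respect to the other,
    # removing the instantaneously coherent (zero-lag) component
    y_orth = np.imag(ay * np.conj(ax) / np.abs(ax))
    x_orth = np.imag(ax * np.conj(ay) / np.abs(ay))
    r1 = np.corrcoef(np.abs(ax), np.abs(y_orth))[0, 1]
    r2 = np.corrcoef(np.abs(ay), np.abs(x_orth))[0, 1]
    # Taking |r| in both directions mirrors the absolute-value edit
    # discussed in this thread (an assumption, not Brainstorm's exact code)
    return (abs(r1) + abs(r2)) / 2

def aec_dynamic(x, y, fs, win_ms=20000, overlap=0.5):
    """Sliding-window ('dynamic') AEC, averaged across windows."""
    win = int(fs * win_ms / 1000)
    step = max(1, int(win * (1 - overlap)))
    vals = [aec_orth(x[i:i + win], y[i:i + win])
            for i in range(0, len(x) - win + 1, step)]
    return float(np.mean(vals))
```

Note that averaging per-window correlations (the dynamic estimate) is in general not the same as a single correlation computed over the whole concatenated window, which may account for part of the mismatch between the two processes.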

Thank you in advance.

Thank you for your report.

As discussed in this thread, we are actively looking into options to consolidate the current code. Please stand by; we'll get back to you with a revised version of the Brainstorm code for AEC measures over the next couple of weeks.


Perfect, thank you. I look forward to hearing from you.

Hi Sylvain!

I just wanted to ask whether there was any update regarding the AEC measures?

Thank you,

Not yet, and probably not before the end of the summer.

Hi Francois/BST Team,

I just wanted to follow up and ask if there was any update regarding the Amplitude Envelope Correlation calculations?


Sorry for the long delay. We will look into this issue as soon as possible, but we need to prioritize our limited resources.