Problem with time-dynamic envelope correlation

Hi everyone, I have a problem with the envelope correlation (EC). I want to compute the dynamic EC of source data, using the Destrieux scouts. I chose Scout function = mean and Apply scout function = before, so that the matrix would be 148x148xtimexfreq instead of 15002x15002xtimexfreq. However, even though I selected these options, it keeps computing the 15002x15002xtimexfreq matrix; in fact, I get an error message saying the matrix is too big. When I compute the EC at the sensor level, it works perfectly. It seems that the scout function is not being applied correctly. But I'm not sure if I'm doing something wrong, or if this is a small bug. Any suggestions?
thanks

@gianmarcoduma I'm sorry, your bug report got forgotten for so long!
Indeed, there is an error here. We'll try to fix it ASAP.

@hossein27en What were you intending to do with this replacement of TargetA with TargetB?
It replaces the subselection of the input data with an empty matrix, and therefore loses the scout selection.

Hi Francois,

Sorry, I was away from the forum for a long time. So, does the problem still exist? Is it the same thing you requested before, to have two process files (1 x N) and (N x N)?

So, does the problem still exist?

Yes, no one has edited the function since then.

Is it the same thing you requested before, to have two process files (1 x N) and (N x N)?

No, this is a different problem: we need a process_henv1.m, a process_henv1n.m and a process_henv2.m.

Hi Francois,

I am not sure if it's a bug, but I couldn't find an appropriate discussion for this, so I am posting it here.
I am computing static (orthogonalised) AEC on source-localised data (LCMV). I have 20 participants in one group and have selected 7 scouts from previous analyses (the scouts were exported from the DK atlas and imported again as a separate atlas). Each trial is 60 s long and there are 10 trials (for most participants).

For each participant, I want to compute the AEC on these sources. I have selected "before" and "mean" as scout options and yet it is generating a 15002x15002 matrix per epoch (the analysis is only for one frequency band). This is becoming extremely time-consuming (several hours for just one epoch of one participant), even on an HPC, and I am also using the Parallel Computing Toolbox.

Any help would be highly appreciated.

I have selected "before" and "mean" as scout options and yet it is generating a 15002x15002 matrix per epoch

You are doing something wrong here: if you select the option "Use scouts" and do NOT select "Scout function: all", then bst_connectivity.m computes the scout time series before running the AEC estimation.
If you are running this from a script, make sure the requested scouts really exist.
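
For reference, a scripted call would look roughly like this. This is only a sketch: process_henv1n is the process name mentioned earlier in this thread, but the option names, values and scout names below are indicative, so always generate the exact syntax with the Pipeline editor ("Generate .m script") after setting the options in the interface:

```matlab
% Sketch only: option names, values and scout names are indicative.
% Regenerate the exact call with: Pipeline editor > Generate .m script
sFiles = bst_process('CallProcess', 'process_henv1n', sFiles, [], ...
    'timewindow', [], ...                                 % full time window
    'scouts',     {'Destrieux', {'G_precentral L', 'G_precentral R'}}, ...
    'scoutfunc',  1, ...                                  % 1 = Mean
    'scouttime',  1, ...                                  % 1 = Before
    'parallel',   1);                                     % Parallel Computing Toolbox
```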

Thanks for quickly checking up on this for me.

I was using the Envelope Correlation NxN (2020) process, but it seems to work well with the AEC NxN process. Just wondering, have all the features of the 2020 process (e.g., dynamic connectivity) been rolled out?

Thanks

have all the features of the 2020 process (e.g., dynamic connectivity) been rolled out?

Most of it is working, but this is still a work in progress...

@hossein27en @Raymundo.Cassani

@Proxima The two functions (Amplitude Envelope Correlation and Envelope Correlation 2020) use the same article as the reference to compute the amplitude envelope correlation.

https://www.nature.com/articles/nn.3101

However, the 2020 function has more features, including the option to perform the time-frequency decomposition with Morlet wavelets, splitting large data into smaller blocks, support for the Parallel Computing Toolbox, other connectivity measures, and a moving-average window for dynamic connectivity. We are working on preparing the A x B case for the 2020 function.

If you use the Hilbert transform, a window length equal to the signal length, and the static case of the 2020 function, you should get results identical to the older AEC function.
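
To make that equivalence concrete, here is a minimal sketch of the quantity both processes estimate for one pair of band-limited signals, following the Hipp et al. (2012) article linked above. This is a didactic illustration with one-way orthogonalization, not the Brainstorm code (the full method averages the two orthogonalization directions):

```matlab
% Didactic sketch of orthogonalized amplitude envelope correlation
% (Hipp et al. 2012), NOT the Brainstorm implementation.
fs = 250;  t = (0:60*fs-1)'/fs;          % 60 s at 250 Hz
x  = randn(size(t));                     % use band-limited signals in practice
y  = 0.5*x + randn(size(t));

ax = hilbert(x);                         % analytic signals (Hilbert transform)
ay = hilbert(y);

% Orthogonalize y with respect to x: keep only the component of y that
% does not share the instantaneous phase of x (removes signal leakage).
ay_orth = imag(ay .* conj(ax) ./ abs(ax));

% Static AEC: one correlation of the amplitude envelopes over the whole
% window (the dynamic case repeats this in a moving-average window).
c   = corrcoef(abs(ax), abs(ay_orth));
aec = c(1,2);
```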

Thanks for the clarification @hossein27en.

Any ideas as to why the 2020 process might be so time-consuming? I was trying to expedite it with the Parallel Computing Toolbox, but it is still slower than the older implementation.

The process Envelope correlation NxN (2020) ignores the scout selection.
Even when you select scouts and the "before" option, it computes the full 15000x15000 matrix, which is insanely big.

This bug requires more technical discussion, let's continue it in this GitHub issue:

The process Envelope correlation NxN (2020) ignores the scout selection.
Even when you select scouts and the "before" option, it computes the full 15000x15000 matrix, which is insanely big.

This bug has been fixed.