= Phase-amplitude coupling =
----
''Authors: Soheila Samiee, Thomas Donoghue''

This tutorial introduces the concept of phase-amplitude coupling (PAC) and the metrics used in Brainstorm to estimate it. These tools are illustrated on three types of data: simulated recordings, rat intracranial recordings and MEG signals.

<<TableOfContents>>

== Phase-amplitude coupling ==

=== Introduction ===
Oscillatory activity in multiple frequency bands is observed at different levels of organization, from micro-scale to meso-scale and macro-scale. Studies have shown that brain functions rely on simultaneous oscillations in different frequency bands [Schutter and Knyazev, 2012]. Classical studies in this field focused on the rhythm within each frequency band separately, and reported that these rhythms are linked to perception and cognition [Cohen, 2008]. However, it has become clear that not only the activity within each single frequency band, but also the relation and interaction between oscillations in different bands, is informative for understanding brain function. This concept has therefore received increasing interest, especially in cognitive neuroscience.

This interaction between oscillations is known as cross-frequency coupling (CFC). Two recognized forms of CFC in brain rhythms are phase-amplitude coupling (PAC) and phase-phase coupling (PPC). In the first type, also called nested oscillations, the phase of the lower frequency oscillation (nesting) drives the power of the coupled higher frequency oscillation (nested), so that the amplitude envelope of the faster rhythm is synchronized with the phase of the slower rhythm. The second form is an amplitude-independent phase locking between ''n'' cycles of the high frequency oscillation and ''m'' cycles of the low frequency one, which is why it is also called ''n:m'' phase synchrony [Palva et al., 2005].

Of these two types, phase-amplitude coupling has received the most interest. It has been shown that behavioral tasks can modulate phase-amplitude coupling [Voytek et al., 2010], and that it is potentially involved in sensory integration, memory processes, and attentional selection [Lisman and Idiart, 1995; Lisman, 2005; Schroeder and Lakatos, 2009]. This coupling has been observed in several brain regions including the hippocampus, basal ganglia, and neocortex, and has been reported in rats, mice, sheep, and monkeys, as well as in humans [Tort et al., 2010].

The following figure shows a schematic of phase-amplitude coupling. The top signal is the sum of a fast and a slow oscillation, where the envelope of the fast oscillation changes with the phase of the slower oscillation: this is an example of PAC. The bottom signal shows only the fast oscillation and the variation of its power. As the comparison of the two signals shows, the power of the fast rhythm is always maximal at a certain phase of the slower oscillation; this phase is called the coupling phase.

{{attachment:schematic.jpg}}

Measures of cross-frequency phase-amplitude coupling can monitor the relationship between the activities that modulate low frequency oscillations, such as sensory or motor inputs, and local cortical activities, such as local computations that are correlated with the amplitude of higher frequency oscillations [Canolty and Knight, 2010]. These properties have motivated the development of several methods for measuring this coupling.
Each of these methods has certain limitations and advantages over the others and may be suited to a particular purpose, which is why no standard method has been adopted for this estimation yet [Tort et al., 2010]. One of these methods, implemented in Brainstorm, is the Mean Vector Length (MVL) proposed by Canolty et al. (2006). This method, and step-by-step instructions for using it, are described in this tutorial.

=== A measure of PAC: Mean Vector Length (Modulation Index) ===
Canolty et al. (2006) pointed out that a time series defined in the complex plane by A_fA · exp(i·φ_fP) can be used to extract a phase-amplitude coupling measure. In this formula, A_fA is the envelope of the fast oscillation and φ_fP is the phase of the slow oscillation. After band-pass filtering around the fast and slow frequencies of interest, and extracting the phase of the slow rhythm and the amplitude of the fast rhythm, each instantaneous amplitude of the fast oscillation is represented by the length of the complex vector, whereas the phase of the slow oscillation at the same time point is represented by the vector angle (see the following figures).

{{attachment:mvl_step3.jpg||height="362",width="398"}} {{attachment:mvl_steps.jpg||height="177",width="189"}}

In the absence of phase-amplitude coupling, the plot of the A_fA · exp(i·φ_fP) time series in the complex plane is characterized by a roughly uniform circular density of vector points, symmetric around zero, because the A_fA values (averaged over cycles of the slow oscillation) are approximately the same for all phases. If the amplitude at f_A is modulated by the phase at f_P, A_fA is higher at certain phases than at others. This higher amplitude at certain angles produces a "bump" in the polar plot of A_fA · exp(i·φ_fP), and therefore a loss of symmetry around zero. This loss of symmetry can be detected by measuring the length of the vector obtained as the mean over all points in the complex plane. A symmetric distribution, as occurs in the absence of coupling, leads to a small mean vector length (the points at different phases cancel each other out), whereas the existence of coupling leads to a larger mean vector length [Tort et al., 2010]. For more details, see [Canolty et al., 2006].
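To make the computation concrete, here is a minimal NumPy/SciPy sketch of the raw (non-normalized) mean vector length. It is only an illustration of the idea above, not Brainstorm's implementation (which uses its own filtering and normalization); the function name, filter settings and default bands are arbitrary choices.

{{{#!python
# Conceptual sketch of the Mean Vector Length (MVL) measure, not Brainstorm's code.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def mean_vector_length(x, fs, f_phase=(6, 8), f_amp=(70, 85)):
    """Return |mean(A_fA * exp(i*phi_fP))| for one signal x sampled at fs (Hz)."""
    def bandpass(sig, band):
        b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        return filtfilt(b, a, sig)

    phi = np.angle(hilbert(bandpass(x, f_phase)))   # phase of the slow rhythm
    amp = np.abs(hilbert(bandpass(x, f_amp)))       # envelope of the fast rhythm
    return np.abs(np.mean(amp * np.exp(1j * phi)))  # length of the mean vector
}}}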
==== References ====
 * Canolty, R. T., Edwards, E., Dalal, S. S., Soltani, M., Nagarajan, S. S., Kirsch, H. E., Berger, M. S., Barbaro, N. M., & Knight, R. T. (2006). [[http://www.ncbi.nlm.nih.gov/pubmed/16973878|High gamma power is phase-locked to theta oscillations in human neocortex]]. ''Science'', ''313''(5793), 1626-1628.
 * Canolty, R. T., & Knight, R. T. (2010). [[http://www.ncbi.nlm.nih.gov/pubmed/20932795|The functional role of cross-frequency coupling]]. ''Trends in Cognitive Sciences'', ''14''(11), 506-515.
 * Cohen, M. X. (2008). [[http://www.ncbi.nlm.nih.gov/pubmed/18061683|Assessing transient cross-frequency coupling in EEG data]]. ''Journal of Neuroscience Methods'', ''168''(2), 494-499.
 * Palva, J. M., Palva, S., & Kaila, K. (2005). [[http://www.ncbi.nlm.nih.gov/pubmed/15829648|Phase synchrony among neuronal oscillations in the human cortex]]. ''The Journal of Neuroscience'', ''25''(15), 3962-3972.
 * Schutter, D. J., & Knyazev, G. G. (2012). [[http://www.ncbi.nlm.nih.gov/pubmed/22448078|Cross-frequency coupling of brain oscillations in studying motivation and emotion]]. ''Motivation and Emotion'', ''36''(1), 46-54.
 * Tort, A. B., Komorowski, R., Eichenbaum, H., & Kopell, N. (2010). [[http://www.ncbi.nlm.nih.gov/pubmed/20463205|Measuring phase-amplitude coupling between neuronal oscillations of different frequencies]]. ''Journal of Neurophysiology'', ''104''(2), 1195-1210.
 * Voytek, B., Canolty, R. T., Shestyuk, A., Crone, N. E., Parvizi, J., & Knight, R. T. (2010). [[http://www.ncbi.nlm.nih.gov/pubmed/21060716|Shifts in gamma phase-amplitude coupling frequency from theta to alpha over posterior cortex during visual tasks]]. ''Frontiers in Human Neuroscience'', ''4''.

== Simulated recordings ==
You can generate synthesized data containing cross-frequency phase-amplitude coupling, with your preferred parameters. The model used for data generation is the simple method introduced in [Tort et al., 2010]. In this section, we generate a synthesized signal and analyze it with the phase-amplitude coupling estimation tool available in Brainstorm.

=== Generation of simulated data ===
Start a new protocol and create a new subject. Without putting any data in the "Files to process" box, click [Run]. Select the process "'''Simulate > Simulate PAC signal'''". This will generate synthesized data with phase-amplitude coupling. You can set all the parameters in the corresponding window.

{{attachment:pac_Generation2.jpg||height="414",width="344"}}

In this window, first choose the subject name and enter the condition name. Then enter your preferred values for the other parameters, including the frequency of the phase driver and the frequency of the high-frequency bursts, in Hz. Description of some of the parameters:

 * '''Coupling phase''' is the phase of the slow oscillation (phase driver) at which the envelope of the fast oscillation (high-frequency bursts) is maximal. For example, if φ = 90 degrees, the maximum coupling happens at the peaks of the slow oscillation.
 * '''Signal-to-noise ratio''' determines the power of the noise added to the pure coupled oscillations. This noise is composed of two parts: a component with a 1/f spectrum, which simulates background brain activity, and white noise, which represents the measurement noise of the recording.
 * '''Coupling intensity''' is a value between 0 and 1, which determines how strongly the two oscillations are coupled.

After running the process, you will get a synthesized data file:

{{attachment:pac_bst_created_synt.png||height="488",width="376"}}

You can display the generated signal by double-clicking on it:

{{attachment:pac_sample_synthesized.png||height="241",width="355"}}
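If you prefer to experiment outside the Brainstorm interface, the sketch below shows the general idea of such a simulation: a slow phase driver whose phase modulates the envelope of fast bursts, plus 1/f-like background activity and white measurement noise. It is not the exact code behind the '''Simulate PAC signal''' process; all parameter values (frequencies, coupling settings, noise scaling) are arbitrary examples.

{{{#!python
# Illustrative simulation of a PAC signal (not Brainstorm's exact generator).
import numpy as np

fs, duration = 1000, 10.0                  # sampling rate (Hz), length (s)
t = np.arange(0, duration, 1 / fs)
f_p, f_a = 7.0, 80.0                       # phase driver and burst frequencies (Hz)
coupling, coupling_phase = 0.8, np.pi / 2  # intensity in [0, 1], phase of max envelope

slow = np.sin(2 * np.pi * f_p * t)
# Envelope of the fast oscillation depends on the phase of the slow one:
envelope = 1 - coupling + coupling * 0.5 * (1 + np.cos(2 * np.pi * f_p * t - coupling_phase))
fast = envelope * np.sin(2 * np.pi * f_a * t)

# Background activity with a roughly 1/f spectrum, plus white measurement noise
spectrum = np.fft.rfft(np.random.randn(t.size)) / np.maximum(np.fft.rfftfreq(t.size, 1 / fs), 1.0)
pink = np.fft.irfft(spectrum, n=t.size)
noise = pink / np.std(pink) + 0.5 * np.random.randn(t.size)

signal = slow + fast + 0.3 * noise         # adjust the 0.3 factor to change the SNR
}}}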
=== Extracting PAC from simulated data ===
To extract the phase-amplitude coupling from this signal, take the following steps:

 * Drag the created file into the Process1 box.

{{attachment:pac_simul2.png||height="176",width="368"}}

 * Select the process "'''Frequency > Phase-amplitude coupling'''".
 * In the corresponding window, set the parameters (see the Process options section below for more information):
  * '''Time window''': the time interval of the signal to be used for extracting PAC.
  * '''Nesting frequency band (low)''': the range of frequencies considered for the slow oscillation (frequency for phase).
  * '''Nested frequency band (high)''': the range of frequencies considered for the high-frequency bursts (frequency for amplitude).
  * '''Row names or indices''': the names of the channels to be processed (for multi-channel recordings; leave empty for the synthesized data).
  * '''Processing options''': if you are not familiar with these options, leave them at their default values (they can make the processing faster for large files, and slower for small files).
  * '''Output options''':
   * '''Save average PAC across trials''': averages the measure when more than one file is used for the estimation of PAC.
   * '''Save the full PAC maps''': '''activate this option''' to have access to the comodulogram (coupling intensity map) after estimating the PAC.

{{attachment:pac_window.png}}

 * Run the process.

At this point the PAC map is extracted from your file, and a new file is added below the synthesized data file:

{{attachment:pac_estimated.png||height="170",width="391"}}

Double-click on the file to see the extracted coupling map:

{{attachment:pac_map_simul2.png||height="295",width="389"}}

This coupling map, also known as a comodulogram, shows the cross-frequency coupling in the frequency domain. The pseudo-color of the map indicates the level of coupling between each pair of low and high frequencies. The abscissa represents the frequencies analyzed as phase frequencies, while the frequencies for amplitude are represented on the ordinate. In this example, there is a bump around 6-8 Hz for the phase and 70-85 Hz for the amplitude, which contains the pair that we initially used to synthesize the data. The parameters of the point with maximum coupling intensity are shown in the title of the figure: the maximum PAC level and the corresponding f_p and f_A, which are 0.3, 6.14 Hz and 80.18 Hz, respectively, in this case.
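For intuition, a comodulogram can be thought of as the MVL measure evaluated over a grid of (phase frequency, amplitude frequency) pairs. The sketch below reuses the `mean_vector_length` function and the `signal`/`fs` variables from the sketches above; the frequency grids and band widths are arbitrary, and the raw MVL is not normalized the way Brainstorm's output is, so the absolute values will differ from the figure.

{{{#!python
# Conceptual comodulogram: scan the MVL over a grid of (phase, amplitude) bands.
# Assumes mean_vector_length, signal and fs from the two sketches above.
import numpy as np

low_freqs = np.arange(2, 15, 1.0)      # candidate nesting (phase) frequencies, Hz
high_freqs = np.arange(40, 151, 5.0)   # candidate nested (amplitude) frequencies, Hz

comodulogram = np.zeros((high_freqs.size, low_freqs.size))
for j, fp in enumerate(low_freqs):
    for i, fa in enumerate(high_freqs):
        comodulogram[i, j] = mean_vector_length(
            signal, fs, f_phase=(fp - 1, fp + 1), f_amp=(fa - 5, fa + 5))

# The "maxPAC" pair reported in the figure title corresponds to the matrix maximum:
i_max, j_max = np.unravel_index(np.argmax(comodulogram), comodulogram.shape)
print(f"maxPAC at f_p = {low_freqs[j_max]:.2f} Hz, f_A = {high_freqs[i_max]:.2f} Hz")
}}}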
=== Important point when using the MVL algorithm for PAC estimation ===
To obtain a reliable result from this method, the data must be at least '''ten cycles''' long with respect to the slowest oscillation in your low-frequency band. This point is even more important in the analysis of real recordings, where the noise level (and/or background brain activity) can be higher than in synthesized data and the coupling intensity can be low. Thus, if you want to examine the coupling for slow oscillations in the [2, 14] Hz band, it is better to use a signal with a minimum length of ten cycles of the slowest oscillation, which would be 10 x 0.5 s = 5 s.

== Rat recordings (CURRENTLY BEING UPDATED - UNFINISHED) ==
For this part of the tutorial you need to download the hippocampus recording database ([[http://crcns.org/data-sets/hc/hc-3|hc3]]) from the [[http://crcns.org/|CRCNS]] website. You first need to [[http://crcns.org/download|create an account]] on that website to be able to download the database.

== MEG recordings ==
Step-by-step instructions to analyze the wMNE source signals for phase-amplitude coupling. For this part of the tutorial you need to get the file sample_resting.zip from the [[http://neuroimage.usc.edu/bst/download.php|Download]] page. Preparation of the anatomy, basic pre-processing and source modeling are only mentioned briefly here; they are similar to what is described in the [[http://neuroimage.usc.edu/brainstorm/Tutorials/TutRawViewer|Continuous Recordings Tutorial]].

=== Step 1: Pre-processing ===
The basic pre-processing steps are outlined briefly here. If you need more information than this overview provides, a detailed description and guidance for all the steps can be found in the Continuous Recordings tutorial and in the tutorials of the "12 easy steps for Brainstorm", all of which are available from this page: [[http://neuroimage.usc.edu/brainstorm/Tutorials|Tutorials]].

Before doing the phase-amplitude coupling analysis, we need pre-processed files to analyze. We need to:

 * Start a new protocol
 * Import the anatomical and functional data
 * Do some basic pre-processing (artifact correction)
 * Project the sources

==== First steps ====
 * Create a new protocol in Brainstorm by going to File > New protocol, and call it 'PACTutorial'.
 * Create a new subject by going to File > New subject, and label it 'PACTutorialSubj1'.
 * Select "Use individual anatomy" for the default anatomy.
 * Select "Use one channel file per condition" for the default channel file.

==== Anatomy ====
 * Click on the anatomy tab at the top left of Brainstorm.

{{attachment:AnatMenu.gif}}

 * Right-click on 'PACTutorialSubj1' and click on "Import anatomy folder".
 * Select the 'Anatomy' folder from the sample_resting folder you downloaded, then click 'Open'. You now need to define the fiducial points.
 * The MRI coordinates should be (+/- a few millimeters):
  . NAS: x=128, y=225, z=135
  . LPA: x=54, y=115, z=107
  . RPA: x=204, y=115, z=99
  . AC: x=133, y=137, z=152
  . PC: x=132, y=108, z=150
  . IH: x=133, y=163, z=196 (anywhere on the midsagittal plane)

==== Functional data ====
 * The sample_resting download contains two 10-minute resting state runs. We are going to use the first one, labelled 'subj002_spontaneous_20111102_02_AUX.ds'.
 * Click on the functional data tab (the middle of the three buttons, next to the anatomy button).
 * Right-click on 'PACTutorialSubj1' and click on "Review raw file". This creates a link so Brainstorm can read the raw file.
 * When asked to refine the registration using head points, select 'Yes'.

==== Pre-processing ====
All data should be pre-processed and checked for artifacts prior to analyses such as PAC, including marking bad segments and correcting for artifacts such as eye blinks and heartbeats with SSPs. For the purposes of this tutorial, we will correct artifacts with SSPs but will not go through marking bad sections. When using your own data, reviewing the raw recordings for bad sections and using clean data is of the utmost importance.

'''SSPs for cardiac and ocular artifacts''': Signal-Space Projections (SSP) are a method available in Brainstorm for projecting stereotyped artifacts (such as eye blinks and heartbeats) out of the functional data.

 * Open the 'Link to raw file'. From the SSP menu, select 'Detect eye blinks'. This creates blink events marking all the blinks so that they can be projected away.
 * The EOG channel is labelled 'EEG058' in this data file. Enter this as the 'Channel name' and click [Run].
{{attachment:blink.gif}}

 * Repeat the same procedure with 'Detect heartbeats' using 'EEG057', which is the name of the ECG channel in this dataset.
 * Then use 'Compute SSP: Eyeblinks' and 'Compute SSP: Heartbeats' to project these artifacts out of the data. Make sure that 'MEG' is written in the 'Sensor types or names' option box if it is not already. For consistency with this tutorial, select only the first component for each SSP.
 * For more information on dealing with artifacts and SSPs, see this tutorial: [[http://neuroimage.usc.edu/brainstorm/Tutorials/TutRawSsp|Artifact Tutorial]]

'''__Regarding sin removal:__''' PAC analysis involves examining a very wide band of frequencies, often the entire range of 2-150 Hz or more. This band contains the frequencies contaminated by power line noise, at either 50 or 60 Hz, and their harmonics. Brainstorm offers tools to remove this line noise from the functional data (sinusoid, or "sin", removal). Here we will not apply sin removal, both for time efficiency and because it is not required for accurate PAC analysis. The PAC function looks for high frequencies occurring specifically at certain phases of the low-frequency signal, so the ubiquitous, constant nature of line contamination effectively prevents it from being identified as PAC. (Similarly, applying sin removal leaves no 60 Hz anywhere, so the function also identifies no PAC there.) To demonstrate that we can safely proceed without sin removal, consider the following PAC maps computed on the same time series, with the only difference being line noise removal on one of them. The map on the left was computed on the raw signal, and the one on the right after sin removal (60 Hz and 120 Hz).
{{attachment:SinCheck.gif}}
Sin removal is therefore not a mandatory pre-processing step for this kind of PAC analysis. However, we do recommend that sin removal remain a consistent part of best-practice pre-processing, as it can be a relevant factor in many other types of analysis.
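If you want to convince yourself of this argument on synthetic data, the sketch below reuses the `mean_vector_length` function and the simulated `signal`, `fs`, `t` and `f_p` variables from the sketches in the "Simulated recordings" section; it is only a conceptual check, not a Brainstorm process. A pure line component has a nearly constant envelope, so its contribution to A·exp(i·φ) averages out across the cycles of the slow rhythm.

{{{#!python
import numpy as np

# Add a strong, constant-amplitude 60 Hz "line noise" component to the simulated signal.
contaminated = signal + 2.0 * np.sin(2 * np.pi * 60.0 * t)

# MVL for the (slow phase, 55-65 Hz amplitude) pair, with and without line noise.
# Because the 60 Hz envelope does not follow the phase of the slow rhythm,
# both values are expected to stay small compared to the truly coupled pair.
pac_clean = mean_vector_length(signal, fs, f_phase=(f_p - 1, f_p + 1), f_amp=(55, 65))
pac_line = mean_vector_length(contaminated, fs, f_phase=(f_p - 1, f_p + 1), f_amp=(55, 65))
print(pac_clean, pac_line)
}}}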
==== Importing ====
Once the data is pre-processed and ready for further analysis, we can import it into Brainstorm, project the sources and run the PAC analysis.

 * To import the data, right-click on the 'Link to raw file' and click on the first option, 'Import in database'.

{{attachment:Import.gif}}

 * You should get the option box shown above. We are going to leave all these options at their default values. When we click [Import], Brainstorm creates a new file with the data imported, our SSPs applied and the DC offset removed. Note: importing this long recording creates a large new file (~3 GB) and may take a couple of minutes.
 * Because of the way Brainstorm reads raw files vs. imported files, it is more computationally demanding to open the imported files (try opening the 'Link to raw file' vs. opening the imported file). This is why we did all the steps that require scrolling through the data on the link to the raw file: it is the most efficient way to proceed, as opening and scrolling through imported files is computationally inefficient.

==== Project sources ====
The imported file is saved as a new condition in the Brainstorm database tree. At this point we still have sensor data and now want to project it into source space. To do this we need a head model and a noise covariance matrix (as well as the imported anatomy).

__Head model__

 * To compute the head model, right-click on the newly imported file (which should be labelled 'Raw(0.00s,600.00s)') and click on "Compute head model". Use the overlapping spheres model and keep all the options at their default values.

__Noise covariance__: The original zip download folder contains an empty room recording acquired when this data was collected. It is labelled XXXX(TO BE ADDED TO DOWNLOAD)

 * Right-click on 'PACTutorialSubj1', click on "Review raw file" and select this file.
 * Right-click on the 'Link to raw file' of the noise recording, go to the "Noise covariance" submenu and click on "Compute from recordings".
 * An option box will pop up; keep all the default values and click [OK] to create the noise covariance matrix. This new file should now be available in the tree. Right-click on the noise covariance file and click "Copy to other conditions" to copy it to all the other conditions where we need it.
 * Further information on the importance and relevance of the noise covariance is available here: [[http://neuroimage.usc.edu/brainstorm/Tutorials/TutNoiseCov|Noise Covariance Tutorial]]
 * You should now have a condition with imported data, a head model and a noise covariance that looks like this:

{{attachment:importedDB.gif}}

 * Right-click on the imported file again and click on "Compute sources". Use the minimum norm estimate (wMNE) and keep all the default settings. We are now ready to run the PAC analysis.

=== Step 2: Using the PAC function - the basics ===
Once you have the sources projected onto the anatomy, proceed with the following instructions to use the PAC function on the source data. This PAC function in Brainstorm is not time-resolved: it analyzes a given time series for any stable occurrence of PAC over the time segment you give it. This can be done at the sensor or source level, for EEG or MEG data. Here we will analyze the source data, by giving the function the time series of the vertices of our projected data.

==== The function ====
 * The function for phase-amplitude coupling analysis is found in the Frequency menu of the process selection list.

{{attachment:PACMenu.gif}}

 * Drag and drop the sources file (it should be labelled 'MN: MEG(Constr)') into the Process1 tab. Click on [Run], go to Frequency and click on "Phase-amplitude coupling".

==== Process options ====
Once you click on "Phase-amplitude coupling", you should get an options box with the following options.

{{attachment:PACOptions.gif}}

 . '''Time window''': The time segment of the input file to be analyzed for PAC.
 . '''Nesting frequency band (low)''': The frequency band of interest for the frequencies for phase (the low, nesting frequencies).
  * This can be a wide exploratory range (e.g. 2-30 Hz) or a much smaller and more specific range (e.g. theta: 4-8 Hz).
 . '''Nested frequency band (high)''': The frequency band of interest for the frequencies for amplitude (the high, nested frequencies).
  * This can be a wide exploratory range (e.g. 40-250 Hz) or a smaller and more specific range (e.g. low gamma: 40-80 Hz).
  * Note: the nested frequencies can only go as high as your sampling rate allows.
 . '''Processing options''' (these options can/should be left at their default values unless you know how to use them):
  * '''Parallel processing toolbox''': The PAC analysis of each time series (of a vertex or sensor) is independent of the PAC analysis of every other time series. The computation is done in a loop in which each iteration is independent of the previous one. The parallel processing toolbox uses a parfor loop in which multiple time series are processed in parallel.
  * '''Use Mex files''': Mex files are available for this process and help speed up the computation.
  * '''Number of signals to process at once''': The file given to the function is processed in blocks, and this option sets how many time series are in each block.
 . '''Output options''':
  * '''Save average PAC across trials''': If this box is selected and multiple files have been given to the process, the process saves the average PAC across all trials in a new file.
   * Be careful with this option: although it is offered, averaging over frequencies (as this requires) can be problematic, and results from this option should be interpreted carefully.
  * '''Save the full PAC maps''': If this option is selected, the full PAC comodulogram is saved for each time series (i.e. the process saves the values for each and every frequency pairing, for each and every time series). If this option is not selected, the process only saves the values at the maxPAC: the frequency pairing with the strongest coupling.
   * Saving the full PAC maps entails saving a matrix of size [number of time series x 1 x high frequencies x low frequencies]. This can very quickly create very large files (in source space it saves a matrix of roughly [15000 x 1 x 39 x 12]). Only select this option if you want to use or look at the maps, or need the information for every frequency pairing.

We will first demonstrate the process by computing the PAC for a single vertex (a single time series in source space). This will allow us to examine what the PAC process does and to visualize the result.

 * In the PAC option box, if the options are not already filled out, fill in the time window as the full length of the file (0 - 599.9996) and the frequency options as the wide bands: low 2-30 Hz and high 40-150 Hz.
 * We are going to arbitrarily compute PAC on source #224 (vertices in the source data are labelled with numbers). Write '224' in the source indices option.
 * Click on [Run].
 * Computed PAC files are saved under the file containing the time series from which they were computed, under the name 'MaxPAC'. It should look like this in Brainstorm:

{{attachment:224DB.gif}}

 * Double-click on the 'MaxPAC' file to open the PAC map, the comodulogram. You should get an interactive graphical representation of the data which looks like this:

{{attachment:comodulogram224.gif}}

 * This map is available because we selected 'Save full PAC maps', and we can therefore visualize all the frequency pairings and their PAC strengths for each time series.
 * The small white circle indicates the PAC pairing with the strongest coupling (the maxPAC pairing), and the results relevant to this pairing are displayed at the top of the comodulogram:
  * MaxPAC: the strength of the coupling
  * flow: the low frequency (nesting)
  * fhigh: the high frequency (nested)
  * coupling phase: the phase of flow at which fhigh occurs
 * You can click on any frequency pairing and the corresponding values will be displayed at the bottom of the comodulogram. This allows you to explore the values represented in the comodulogram. Here we clicked on the PAC at around 9 Hz nesting. We can see that the coupling strength is similar to the coupling at the maxPAC flow of 11.47 Hz. This suggests there are two clusters of PAC in this time series, of almost equal strength.

{{attachment:comodulogram224sel.gif}}
The full PAC comodulograms contain a lot of information, especially considering that we get this amount of information for every time series (every vertex) if we run the process across all sources. You do not need to save the full PAC maps when doing this analysis: the MaxPAC function offers the option to save only the values at the maxPAC, the frequency pairing with the highest coupling strength. This examines the time series for the maximally coupled pairing and then saves only the results related to that pairing. It is no quicker to compute, but it saves much smaller files.

To demonstrate this, we can re-run the same PAC analysis on the same time series, but unselect the 'Save full PAC maps' option. (You can do this yourself, but if it is a long computation on your system you can simply look at the result below; it is just another representation of the same data contained in the 'full' file.) Using maxPAC in this manner saves nothing that can be visualized: double-clicking on the resulting file simply opens the file contents, which contain the four values of interest computed by the maxPAC function.
{{attachment:PACnonfull.gif}}
If you are unfamiliar with these tables, this is the 'File contents' view available for every file in the Brainstorm database (right-click > File > View file contents). It contains the path and name of the actual file on disk, as well as some summary information about everything stored in the file.

__Relevant maxPAC information in the file contents__
 * TF: the top line in the table contains the coupling strength(s) of the maximally coupled frequency pairing (the maxPAC pair)
 * Options: this struct contains all the parameters that were given to the process
 * sPAC: this struct contains all the information related to the PAC function (other than the coupling strength saved in TF)
  * Nesting Freq: the low frequency at the maxPAC
  * Nested Freq: the high frequency at the maxPAC
  * PhasePAC: the phase at the maxPAC
  * DirectPAC: holds the values of all the other frequency pairings, used to display the PAC comodulograms
   * Empty when the full maps are not saved
   * Contains a very large matrix when the maps are saved
  * [Low Freqs]: contains the low frequencies used
  * [High Freqs]: contains the high frequencies used

This is a much more efficient way of saving and representing a small part of the data, and of pulling out the main mode of PAC in a time series. The caveat, as we can see in the example map of vertex #224, is that pulling out only the strongest pair may not be particularly representative of the overall PAC in the time series. In this special case, where the difference in coupling strength between the flow of 11.47 Hz and of 8.30 Hz is likely not statistically significant, it may be somewhat arbitrary which pair is picked. The vertex used here is something of an anomaly: most of the time there is a much more obvious single pairing in the PAC maps. When doing PAC analysis, you should consider the relevance and importance of finding only the maxPAC pair for your hypothesis and guide your analysis accordingly.

=== Step 3: Verifying with Canolty maps ===
Canolty maps are a type of time-frequency decomposition that offers another way to visualize the data, and they serve as a complementary tool to visualize and assess phase-amplitude coupling. There are currently no significance tests within Brainstorm that can tell whether the PAC in a given time series is significant, but Canolty maps provide an important way to verify and corroborate the results of the PAC process.

Canolty maps are a kind of time-frequency decomposition in which the zero point of the map is aligned with the trough of a low frequency of interest. The process aligns the data to a specific low frequency so as to visualize what happens in the power spectrum in relation to the phase of that low frequency. Specifically, it filters the data to extract the low frequency of interest, marks each trough as an 'event', extracts a time window around each 'event' and averages over all of them. The colormap of the Canolty map represents power relative to the mean power. By representing a time-frequency map in relation to a low frequency, we can visualize whether the power of any high frequency fluctuates systematically with the phase of the low frequency (basically, we can visualize PAC). If PAC is present, we should see stereotyped stripes, with the power of certain high frequencies changing consistently with the phase of the low frequency. If there is no PAC, there will be no discernible pattern (the map will just look like a 'mess').
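As an illustration of this trough-locked averaging idea, here is a minimal sketch, independent of Brainstorm's implementation (which differs in its filtering, epoching and normalization details). The variables `x` and `fs`, the filter settings and the frequency grid are placeholders.

{{{#!python
# Minimal sketch of a Canolty-style map (not the Brainstorm implementation):
# epoch the signal around the troughs of the low-frequency rhythm and average
# the high-frequency envelopes across epochs, relative to their mean power.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, find_peaks

def canolty_map(x, fs, f_low=11.5, high_freqs=np.arange(40, 151, 5.0), win=0.5):
    # 1) Filter around the low frequency of interest and find its troughs
    b, a = butter(3, [(f_low - 1) / (fs / 2), (f_low + 1) / (fs / 2)], btype="band")
    slow = filtfilt(b, a, x)
    troughs, _ = find_peaks(-slow)                     # a trough is a peak of -slow

    half = int(win * fs)                               # half epoch length in samples
    troughs = troughs[(troughs > half) & (troughs < x.size - half)]

    # 2) Envelope at each high frequency, epoched around the troughs and averaged
    # (fs must exceed twice the highest frequency analyzed)
    tf = np.zeros((high_freqs.size, 2 * half))
    for i, fa in enumerate(high_freqs):
        b, a = butter(3, [(fa - 5) / (fs / 2), (fa + 5) / (fs / 2)], btype="band")
        env = np.abs(hilbert(filtfilt(b, a, x)))
        epochs = np.stack([env[k - half:k + half] for k in troughs])
        tf[i] = epochs.mean(axis=0) / env.mean()       # power relative to mean power
    return tf                                          # rows: high freqs, cols: time
}}}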
Canolty maps are named after the first author of the paper in which they were introduced, 'High gamma power is phase-locked to theta oscillations in human neocortex' by Canolty et al., published in ''Science'' in 2006.

==== The function ====
The Canolty maps function is also found in the Frequency menu of the process selection list.

{{attachment:CanoltyMenu.gif}}

There are two ways to use Canolty maps: you can manually input a low frequency of interest, or you can give the process a maxPAC file and it will take the low frequency at the maxPAC value.

 * Process1 tab: drop a file of time series into the Process1 tab and manually specify the low frequency of interest.
 * Process2 tab: drop a file of time series into Files A and the corresponding maxPAC file into Files B. The process builds the Canolty maps by finding, for each time series, the low frequency defined in the maxPAC file.

We will continue with the Process2 version, to complement our maxPAC results.

 * Click on the Process2 tab. In the Files A box, drop the original time series (the source data file). In the Files B box, drop the maxPAC file that we just created for source 224.

{{attachment:224canoltyBox.gif}}

 * In the Process2 format, clicking on [Run] only shows the processes that are available. Canolty maps is still under the Frequency menu, but the drop-down menu will look a bit different.

{{attachment:canolty2menu.gif}}
When you click on the Canolty maps (Process2) function, you should get an options box like this:
{{attachment:Canolty2Options.gif}}

==== Process options ====
 . '''Time window''': The time segment of the input file used to compute the Canolty map.
 . '''Epoch time''': The length of the epochs used.
 . '''Source indices''': The time series of the given file for which to compute the Canolty maps.
 . '''Number of signals to process at once''': This process is also done in blocks, and this option sets the block size (it can be left at the default value).
 . '''Save averaged low frequency signals''': To create the Canolty maps, Brainstorm filters the input time series at the low frequency of interest. This option saves that filtered signal, which can be useful for visualization.

The only difference in the Process1 version of Canolty maps is the additional required field "Nesting frequency". In this case you can enter any low frequency of interest with which to compute the Canolty map(s).
{{attachment:Canolty1Options.gif}}

 * Click [Run] and the option box for the Canolty process will pop up. When the given maxPAC file does not include PAC values for all the time series in Files A (as is the case now, since our maxPAC file only contains the PAC values for vertex #224), Brainstorm does not automatically determine which time series it has PAC information for, so this information has to be given.
 * In the 'Source indices' option, write '224' so that the process uses the maxPAC for this vertex to compute the Canolty map.

{{attachment:canolty224Options.gif}}

 * Canolty maps are saved under the time series file from which they were computed, as for the maxPAC. If the low-frequency signal is saved, the link to this file is saved below it. Your Brainstorm database should look something like this:

{{attachment:canotlyTree224.gif}}

 * Double-click on the 'Canolty map' file to open it. You should see the following image. The vertex number and the low frequency are written at the top of the figure.

{{attachment:Canolty2-224.gif}}

 * We can see that the Canolty map corroborates what was represented in the maxPAC file. With the data aligned to the phase of the low frequency (here 11.47 Hz), the amplitude of the gamma (indicated by the color) is patterned such that it appears to be related to the phase of the low frequency, which is phase-amplitude coupling.
 * We can also check whether the high (nested) frequencies are similar between the two functions. As in our PAC comodulogram, we can see in the Canolty map that the nested frequency is predominantly around 80 Hz, but that there is (weaker) PAC at other frequencies, such as around 120 Hz.
 * The Canolty map itself contains no representation of the low frequency used, but it can be useful to visualize it. The low frequency is accessible through the other link in the Brainstorm database: double-click on the file named 'Canolty ERP'. This opens the time series filtered at the low frequency of interest that was used to compute the Canolty map.

{{attachment:CanoltyERPLabel.gif}}

 * It is common for the low frequency to look like this, where it appears to fade away further from the zero point. This is because of some jitter in the signal: the low frequency may not be exactly 11.47 Hz, so the average of the aligned epochs retains less of the signal further away from the zero point.

{{attachment:C-ERP-224.gif}}

 * By arranging the Canolty map and the low frequency file side by side, you can get a sense of how the low frequency oscillation and the high frequency amplitude relate to each other. The time selection in the two figures is synchronized, so try clicking at particular parts of the low frequency file and examining the power in the Canolty map. You should notice that the gamma amplitude is low at the peak and trough of the low frequency oscillation and high in between.
 * There are other, more quantitative ways of verifying the phase of the coupling (such as using the phase value extracted by the maxPAC function). Canolty maps can be used for visualization purposes and as converging evidence of the coupling.

{{attachment:Canolty+ERP224.gif|Canolty+ERP224.gif}}

You may remember that in the PAC comodulogram for vertex 224 the maxPAC value was at 11.47 Hz, but that there were also other areas of high PAC, including an almost equal coupling intensity at 8.3 Hz.
Canolty maps only portray information relevant to the low frequency used to create the map; therefore we cannot draw any conclusions about PAC at a low frequency of 8.3 Hz from the Canolty map we made with a low frequency of 11.47 Hz. We will now examine the PAC at 8.3 Hz with a new Canolty map, using the Process1 version. Since 8.3 Hz is not the low frequency of the maxPAC pairing, we cannot examine it by giving the maxPAC file; we must manually specify it as the low frequency of interest.

 * Click on the Process1 tab and drop in the source time series.

{{attachment:Process1canolty.gif}}

 * Press [Run], go to Frequency and click on the Canolty maps process.
 * We need to specify the vertex of interest (224) and the low frequency of interest (8.3 Hz). Fill out the option box as follows:

{{attachment:Canolty1-8.3Options.gif}}

 * Press [Run].
 * Open the resulting file and you should see the following Canolty map.
 * Again, we can see that when the data is aligned to the low frequency of 8.3 Hz, the high frequency amplitude changes in a consistent manner in relation to the phase of the low frequency, supporting that there is indeed PAC at a low frequency of 8.3 Hz.

{{attachment:canolty-224-8.3.gif}}
 * We can also visualize the relation by using the low-frequency filtered signal that we saved, as before.

{{attachment:Canotly224-8.3+ERP.gif|Canotly224-8.3+ERP.gif}}
An alternative use of Canolty maps is to verify that, when the PAC function indicates very low levels of phase-amplitude coupling, the Canolty map corroborates this as well.

 * Open the PAC map for vertex #224. We now want to find a low frequency at which the PAC function did not indicate much coupling.
{{attachment:224-3.57.gif}}

 * Take the low frequency of 3.57 Hz.
 * Now use this value of 3.57 Hz to create a Canolty map, in exactly the same way we did with the low frequency of 8.30 Hz, changing only this value in the parameters.
 * You should get a Canolty map that looks like this:

{{attachment:224C-3.57.gif}}

 * Here we can see that the Canolty map displays nothing that looks like consistent coupling between the low frequency of 3.57 Hz and any high frequency. This is what we expected based on the PAC comodulogram.

You should note that the Process1 version of Canolty maps can be run on any time series, without ever running the exhaustive PAC process. However, since Canolty maps only use one low frequency of interest, this is not a very efficient approach (unless you have a specific frequency of interest, such as in a frequency-tagging paradigm).

=== Step 4: 'Advanced' PAC analysis ===
By now you should have a good idea of how to use the PAC process, what it outputs, and how to check the results with the complementary Canolty maps process. The 'advanced' aspect is not a question of increased difficulty but simply of increased scale. We have been working with a single time series here; it is likely that the PAC analyses you perform will involve much larger sets of data. In practice, this comes down to filling in the 'Source indices' option of the PAC process. The options are:

 * Empty: performs the PAC analysis on all the time series in the file (all sensors or all sources).
 * Specify a subset: you can specify a single time series or a subset of them.
 * Recall that Brainstorm evaluates what you write in 'Source indices'. This means you can write a single vertex (e.g. '224'), a list (e.g. '224, 225'), or an expression evaluated by MATLAB before being handed to the PAC function (e.g. '224:230').

PAC analysis is a very computationally demanding process. Options for reducing the computation time include:

 * Downsample the data: you can downsample temporally (to a lower sampling rate) or spatially (downsample the anatomy to a smaller number of vertices).
 * Use shorter time segments.
 * Use ROIs or scouts to reduce the number of time series.

When you run a file with multiple time series and open it (with full PAC maps), it opens the map of the first time series, in the same way as if you only had one. In the Brainstorm window there is a 'Selected data' option to go to any time series of interest. You can also scroll through the maps using the up and down arrows on the keyboard.

{{attachment:PACSelectData.gif}}

The same considerations apply to the Canolty maps process. Experiment as you like with the PAC function on inputs with multiple time series.