OpenMEEG error #137


In an effort to quantify errors in forward models (mostly spherical, but I might also look at Nolte's 2003 method), I'm trying to produce an accurate single-compartment BEM with a 1 mm edge inner-skull mesh (40k vertices), so that the edge length is of the same order as the minimum source-to-surface distance, which, as I understand it, is required for accurate modeling of superficial sources. Unfortunately, OpenMEEG crashes. My latest attempt was on a workstation with 48 GB RAM, and Brainstorm indicated that OpenMEEG would need about 6 GB to run.

First, can you confirm whether this error (#137) is memory-related? Is there anything I can do to make this work other than reducing the mesh resolution (and therefore the accuracy)? Thanks!

./om_gain -MEG "/export02/data/marcl/temp/openmeeg_hminv.mat" "/export02/data/marcl/temp/openmeeg_dsm.mat" "/export02/data/marcl/temp/openmeeg_h2mm.mat" "/export02/data/marcl/temp/openmeeg_ds2meg.mat" "/export02/data/marcl/temp/openmeeg_gain_meg.mat" 2>&1: Killed

** Error: OpenMEEG call: om_gain -MEG
**       "/export02/data/marcl/temp/openmeeg_hminv.mat"
**       "/export02/data/marcl/temp/openmeeg_dsm.mat"
**       "/export02/data/marcl/temp/openmeeg_h2mm.mat"
**       "/export02/data/marcl/temp/openmeeg_ds2meg.mat"
**       "/export02/data/marcl/temp/openmeeg_gain_meg.mat"
** OpenMEEG error #137: 
** ./om_gain version 2.4.1 compiled at Aug 29 2018 08:19:44 using OpenMP
**  Executing using 6 threads.
** | ------ ./om_gain
** | -MEG
** | /export02/data/marcl/temp/openmeeg_hminv.mat
** | /export02/data/marcl/temp/openmeeg_dsm.mat
** | /export02/data/marcl/temp/openmeeg_h2mm.mat
** | /export02/data/marcl/temp/openmeeg_ds2meg.mat
** | /export02/data/marcl/temp/openmeeg_gain_meg.mat
** | -----------------------
** /bin/bash: line 1: 26801 Killed                  ./om_gain -MEG "/export02/data/marcl/temp/openmeeg_hminv.mat" "/export02/data/marcl/temp/openmeeg_dsm.mat" "/export02/data/marcl/temp/openmeeg_h2mm.mat" "/export02/data/marcl/temp/openmeeg_ds2meg.mat" "/export02/data/marcl/temp/openmeeg_gain_meg.mat" 2>&1
** For help with OpenMEEG errors, please refer to the online tutorial:

Edit: Indeed it used much more memory than Brainstorm estimated (can the estimate be improved?):
[Thu Apr 25 05:23:23 2019] Killed process 26801 (om_gain) total-vm:45331880kB, anon-rss:44350760kB, file-rss:1828kB, shmem-rss:0kB

Any suggestions?

More memory? :grinning:

Yeah... :slight_smile: but it's not clear how much more I need.
The OpenMEEG log shows that a number of om_assemble steps actually completed. I'm hoping there's a way to run om_gain that uses less memory (fewer sources at a time, maybe?). Failing that, is there a way to estimate how much memory it needs?
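For a rough sense of scale, the dense double-precision matrices involved can be sized by hand. The dimensions and the combination formula below are assumptions for illustration (the actual expression OpenMEEG evaluates may differ), but they show how a 40k-vertex mesh alone pushes the footprint into the tens of gigabytes:

```python
def dense_matrix_gb(rows, cols, bytes_per_elem=8):
    """Size in GiB of a dense matrix of double-precision (8-byte) elements."""
    return rows * cols * bytes_per_elem / 1024**3

# Hypothetical problem dimensions (assumptions, not taken from the log):
n = 40000   # BEM unknowns, assuming roughly one per inner-skull vertex
p = 45000   # dipoles, e.g. ~15k cortex vertices x 3 orientations
m = 300     # MEG sensors

hminv_gb = dense_matrix_gb(n, n)   # inverse head matrix, ~11.9 GiB
dsm_gb   = dense_matrix_gb(n, p)   # source matrix, ~13.4 GiB
tmp_gb   = dense_matrix_gb(n, p)   # intermediate product (hminv * dsm), same shape as dsm
gain_gb  = dense_matrix_gb(m, p)   # final MEG gain matrix, comparatively tiny

total = hminv_gb + dsm_gb + tmp_gb + gain_gb
print(f"hminv: {hminv_gb:.1f} GiB, dsm: {dsm_gb:.1f} GiB, "
      f"intermediate: {tmp_gb:.1f} GiB, total: {total:.1f} GiB")
```

With these assumed dimensions, holding the operands and one intermediate product in RAM already comes to roughly 39 GiB, the same order as the 45 GB allocation at which the process was killed.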

Yeah. In principle you're absolutely right: every program involves a trade-off between memory use and execution time. Unfortunately I don't know (yet) that much about the OpenMEEG implementation. Sorry.


Indeed, the estimate of the memory needed for the computation is wrong. OpenMEEG requires a lot more RAM than what is claimed here:

@Alexandre @papadop @mclerc @sik
Could you provide an estimate of the RAM and disk space needed for each of the steps involved here?
It would prevent a lot of the memory errors that people are running into.


Line 182 seems to count sources only when the source space is a grid. We're probably missing something like:

if P == 0
    % Source space is a surface: count the cortex vertices instead
    sTess = in_tess_bst(OPTIONS.CortexFile);
    P = size(sTess.Vertices, 1);
end

With this, the estimates for the head geometry (est_HM, est_HMINV, est_DSM) roughly match the file sizes. But the RAM estimate for GainMEG is still short: in my case it jumped from 6 GB to 21 GB, yet it's still missing something, since the process crashed at 45 GB.


Sorry for the slow reaction time. Could you share the OpenMEEG files
that produced the 45 GB allocation?

It's possible that the latest OpenMEEG release does something naive in the MEG gain
matrix computation.

If you have the energy to compare with the previous OpenMEEG release (2.2), let me know. Otherwise we'll have a look once we have the files.