Filtering of long recording

It seems that changing the way I import the data, from a continuous file to a recording in the database, is fixing the issue.

The first version links the file to the database as a continuous file (process_import_data_raw).
The second one makes a full copy of the file into the database as an "epoch" (import_data).
It is not surprising that the behavior is different.
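
For reference, this is roughly what the two approaches look like in a pipeline script. The subject name, file path and option values below are placeholders, and process_import_data_time is used here as the process-level counterpart of import_data; the exact calls for your own data are best obtained from the pipeline editor's "Generate .m script" option.

```matlab
% Option 1: link the continuous file to the database (review raw file)
sRaw = bst_process('CallProcess', 'process_import_data_raw', [], [], ...
    'subjectname',    'Subject01', ...                                  % placeholder subject
    'datafile',       {'/path/to/recording.eeg', 'EEG-BRAINAMP'}, ...   % placeholder BrainVision file
    'channelreplace', 1, ...
    'channelalign',   0);

% Option 2: copy the recordings into the database as imported data
sImported = bst_process('CallProcess', 'process_import_data_time', [], [], ...
    'subjectname', 'Subject01', ...
    'condition',   '', ...
    'datafile',    {'/path/to/recording.eeg', 'EEG-BRAINAMP'}, ...
    'timewindow',  [], ...   % empty = import the whole file
    'split',       0);
```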

However, the created EDF file doesn't correspond to the data in Brainstorm and contains data discontinuities.

This could be due to the length of the EDF pages being set incorrectly.
But I could not reproduce this behavior with this sequence: linking a BrainVision .eeg file, resampling, and exporting as EDF.

Could you try to create a minimal reproducible example?

  • Find one short file with which you can reproduce the error
  • Write the minimal script that reproduces the error from this file (the notch filter is probably not necessary); a skeleton is sketched after this list
  • Upload the file and the script, and post them here
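
Something along these lines would be a good starting point. The paths, subject name and resampling rate are placeholders, and the option names may differ slightly between Brainstorm versions, so it is safer to copy the exact calls from a script generated with the pipeline editor's "Generate .m script" option:

```matlab
% Minimal reproduction sketch: link the file, resample, export as EDF
RawFile = '/path/to/short_recording.eeg';   % short file that shows the problem

% 1) Link the continuous file to the database
sRaw = bst_process('CallProcess', 'process_import_data_raw', [], [], ...
    'subjectname', 'TestSubject', ...
    'datafile',    {RawFile, 'EEG-BRAINAMP'});

% 2) Resample (the notch filter is probably not needed to reproduce the issue)
sResampled = bst_process('CallProcess', 'process_resample', sRaw, [], ...
    'freq', 250);   % placeholder target rate

% 3) Export as EDF: in the GUI, right-click the resampled file
%    > File > Export to file, and select the EDF format
```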

Thanks