Hello, I have this continuous.dat file that I want to analyze. I made a new protocol and a new subject, then loaded the file using "Review raw file". However, when I tried to apply the Notch filter, it gave me this error in MATLAB: 'ERROR: File is truncated(11085309 values were read instead of 11085312)', and this error report from Brainstorm: 'Error using out_fopen_bst. Could not open output file'. I think it happens only with the .dat file, because I had no problems when I preprocessed the .mat files. Is there any reason why the file would be truncated?
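For reference, a minimal check of what the "truncated" message literally means could look like the sketch below (Python; it assumes the flat binary .dat holds int16 values interleaved across channels, and that the channel count can be read from the num_channels field of structure.oebin, so treat those names as assumptions and check them against your own recording):

```python
import json
import os

# Placeholder paths for illustration only; point them at your recording folder.
dat_path = "continuous.dat"
oebin_path = "structure.oebin"

# structure.oebin is a JSON file; the "continuous" and "num_channels" key names
# are assumptions based on the Open Ephys binary format.
with open(oebin_path) as f:
    info = json.load(f)
n_channels = info["continuous"][0]["num_channels"]

n_bytes = os.path.getsize(dat_path)
n_values = n_bytes // 2                      # int16 -> 2 bytes per value
n_samples, remainder = divmod(n_values, n_channels)

print(f"{n_values} values / {n_channels} channels -> {n_samples} samples per channel")
if remainder:
    print(f"File size is not a whole number of samples ({remainder} leftover values):"
          " the file may be truncated.")
```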
Where is this .dat file coming from (device, software)?
Was there any pre-processing done on it?
Do you experience the same error with other .dat files from the same origin?
Would it be possible to share the problematic file with us for further debugging?
We recorded the data using the Open Ephys software, and no pre-processing was done. For the other .dat file, Brainstorm doesn't show me an error report, but I am getting the same 'ERROR: File is truncated' message from MATLAB, repeated many times.
I just tried to upload the file; however, it says the file is too large and asks me to upload it to a cloud-sharing service. Would you mind telling me how I can do that?
After downloading the folder, the path to the data is POST_M1_M22024-02-25_12-40-44/Record Node 115/experiment1/recording1/continuous/Acquisition_Board-108.Rhythm Data-B
Yes! One thing I want to let you know: for the Open Ephys format files (.continuous), we wrote code that converts them to .mat files, and we use those .mat files in Brainstorm. However, for the Binary format files (.dat), we do not convert them, since Brainstorm can read the binary format directly.
The recording I just shared (the one that gives no error when applying the notch filter) is the .mat file that we converted manually from the Open Ephys format file with our code.
Hi @hannah0819, thank you for sharing the flat binary file.
We have updated the support for Open Ephys continuous binary files in commit: c264adf
Update your Brainstorm instance to be able to review and process the file. Please let us know how this update works for you. In addition, you can use your .dat files directly in Brainstorm, without having to convert them.
These are the details: the trouble lay in the fact that after version 0.6.0 of the Open Ephys GUI software, the file timestamps.npy changed the data it holds. In previous versions (such as the one implemented in Brainstorm), it contained the sample indices. However, after version 0.6.0 it contains the timestamps (in seconds) for the samples, and the sample indices are stored in sample_numbers.npy. You do not have to worry about this change, as Brainstorm can now properly review both old and new files.
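To illustrate the difference, here is a small sketch (Python with NumPy; the file names come from the description above, but the folder path and the exact dtypes are assumptions) that inspects which convention a given recording uses:

```python
import numpy as np

# Placeholder folder; point this at the "continuous/<stream>" directory of a recording.
stream_dir = "recording1/continuous/stream0"

# Before GUI 0.6.0: timestamps.npy holds the integer sample indices.
# From 0.6.0 on: timestamps.npy holds timestamps in seconds, and the integer
# sample indices live in sample_numbers.npy.
timestamps = np.load(f"{stream_dir}/timestamps.npy")
print("timestamps.npy dtype:", timestamps.dtype)   # typically integer (old) vs float (new)

try:
    sample_numbers = np.load(f"{stream_dir}/sample_numbers.npy")
    print("sample_numbers.npy found -> GUI >= 0.6.0, first indices:", sample_numbers[:5])
except FileNotFoundError:
    print("no sample_numbers.npy -> older GUI, timestamps.npy already holds the indices")
```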
Hello @Raymundo.Cassani! I am having another problem importing raw data (from 2020) which is in Binary format (.dat). I used 'Review raw file' to import the dataset; however, it seems that the time is being read incorrectly.
The dataset is about 4500 seconds long with a 2000 Hz sampling rate; however, when I import the Binary format file, it shows that the data is about 300 seconds long with a 30000 Hz sampling rate.
The data was recorded in both .dat format and .continuous format back in 2020. The .continuous format file was converted to .mat format, and when I import that file, the time range seems correct. I uploaded both the .dat file and the .mat file (converted from the .continuous format file). Would you please take a look?
The file structure.oebin contains the information about the recording, and it does indicate 30,000 Hz as the sampling frequency. That is why this sampling frequency is used when the file is reviewed in Brainstorm.
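Note that the total number of samples is consistent either way: 4500 s × 2000 Hz = 9,000,000 samples, and 9,000,000 samples ÷ 30,000 Hz ≈ 300 s. So the same amount of data is being read; only the sampling rate used to interpret it differs.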
I am not sure how this is fixed in the .mat file, since it only contains the recording values in a matrix of shape [nChannels, nSamples] and carries no information about the sampling frequency.
If you are sure that the recording was performed at 2000 Hz, you could modify structure.oebin to indicate that sampling rate. You may also want to check with the Open Ephys team whether there was a bug back then (2020) that saved an incorrect sampling frequency.
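If you decide to edit the file, a minimal sketch could look like the one below (Python; structure.oebin is a JSON file, but the "continuous" and "sample_rate" key names are assumptions based on the Open Ephys binary format, so verify them against your own file and keep a backup of the original):

```python
import json

oebin_path = "structure.oebin"   # path to the recording's structure.oebin

with open(oebin_path) as f:
    info = json.load(f)

# Update the sampling rate of each continuous stream; key names are assumptions.
for stream in info.get("continuous", []):
    print("current sample_rate:", stream.get("sample_rate"))
    stream["sample_rate"] = 2000.0   # the rate you believe was actually used

with open(oebin_path, "w") as f:
    json.dump(info, f, indent=2)
```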