Hi there. I am new to this field; I am just starting a PhD in media arts technology, and I am interested in creating animations through brain-computer interaction. Today I came across a commercial EEG headset aimed at the gaming and SDK market, the Emotiv EPOC by Emotiv Systems (please see http://www.emotiv.com/index.php for more information). I am wondering whether Brainstorm would be able to take a live data feed from such a device. I look forward to your reply.
Hi Tajinder,
First, thanks for your interest in our software.
Brainstorm is for now mostly research software, used in hospitals and major imaging centers. Because of that, we mostly provide support for clinical EEG acquisition systems.
But with the development of these cheap and easy-to-use EEG devices, I can imagine that the need for open-source tools like Brainstorm will grow rapidly in the next few years.
Brainstorm does not provide any real-time support for now. You could use it to create intermediate results offline (to reconstruct brain sources or to filter your data) and use those results in a real-time application, as some other people do. But I suspect it would be too much development at this stage, as real-time BCI might not be your primary interest.
I’m sorry I might not be able to provide a lot of help to you right now.
However, both real-time and general-audience EEG systems are interesting for us as future developments. So if there are small adaptations to Brainstorm that could help you, such as adding support for the Emotiv EPOC file format, this is something that we could work on together.
The Emotiv helmet has a lot of potential in generative art and visuals. I have already used it for a digital installation and to control parameters of generative content during a live VJ act. Both times, I used the SDK directly, embedded in openFrameworks (C++) with OSC. I have tried to link it to MATLAB in real time, but it is quite difficult since their SDK is a very closed, proprietary .dll.
Apparently FieldTrip supports it, so it is possible: http://fieldtrip.fcdonders.nl/development/realtime/emotiv
You can also have a look at http://openvibe.inria.fr/, which has a specific driver for it.
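If you go the FieldTrip route, the general idea is that a small acquisition tool streams the Emotiv samples into the FieldTrip realtime buffer, and you then poll that buffer from MATLAB. A minimal polling sketch, assuming the buffer is running at the default address buffer://localhost:1972 and that the FieldTrip fileio functions are on your MATLAB path:
[code]
% Poll the FieldTrip realtime buffer for new Emotiv samples
ftbuffer = 'buffer://localhost:1972';
hdr = ft_read_header(ftbuffer);           % channel names, sampling rate, etc.

prevSample = 0;
while true
    hdr = ft_read_header(ftbuffer);       % refresh to get the current sample count
    if hdr.nSamples > prevSample
        % Read only the samples acquired since the last iteration
        dat = ft_read_data(ftbuffer, 'header', hdr, ...
            'begsample', prevSample + 1, 'endsample', hdr.nSamples);
        prevSample = hdr.nSamples;
        % dat is [nChannels x nNewSamples]: filter it, extract features,
        % or forward it to your visuals from here
    end
    pause(0.05);                           % avoid busy-waiting
end
[/code]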
We actually used Brainstorm functions to perform real-time source reconstruction in a recent publication. That was with MEG, but the principles hold for EEG as well, as long as you can capture buffers of data in real time. The approach is described in the article (it is also based on the real-time server feature from FieldTrip), but please send us questions through this thread and I would be happy to provide details.
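To give an idea of the principle (a rough sketch, not the exact code from the paper): Brainstorm can compute a linear inverse operator offline and store it as an ImagingKernel matrix (nSources x nChannels) in a kernel-based result file; in the real-time loop you then only need to multiply that precomputed kernel with each incoming data buffer. The file name below is hypothetical, and the data buffer is assumed to come from the FieldTrip example above:
[code]
% Offline (once): load the imaging kernel computed and saved by Brainstorm
% (hypothetical file name; in practice, export it from your Brainstorm database)
res = load('results_MN_EEG_KERNEL.mat');
K   = res.ImagingKernel;                 % [nSources x nChannels]

% Online (per buffer): dat is a [nChannels x nSamples] block, e.g. the one
% returned by ft_read_data in the previous example
sources = K * dat;                       % [nSources x nSamples] source time series

% Example of a crude scalar "activation" value to drive a visual parameter
activation = mean(abs(sources(:)));
[/code]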
Thanks Guillaume, I will have a look at those links. How well did the VJ control work out?
Best, Tajinder
Thanks Sylvain. As I get deeper into this topic, I'm sure questions will arise, so I may take you up on your offer to answer them at a later time. Thanks, Taj
Thanks for the reply Francois… I think you're right, at this stage it may be too much for me to develop; in the meantime, fingers crossed that someone who has the skills will. Thanks, Taj
The control was less responsive than a usual MIDI interface, since the Emotiv neuromarkers are computed over a long time constant (3-4 s). However, it works quite nicely for global parameters, both in filters and in generative content. If you need a really fast response, I would advise using the Wiimote or the Kinect.
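In practice, that means treating the Emotiv values as slowly drifting control signals rather than as triggers. A toy sketch of that kind of mapping, with made-up names and bounds and an arbitrary smoothing factor, just to illustrate the idea:
[code]
% Smooth a stream of neuromarker values and map them to [0, 1] so they can
% drive a slow, global visual parameter (all names and bounds are made up)
alpha     = 0.05;                        % smoothing factor per update
lowBound  = 0.2;                         % example range of the raw marker
highBound = 0.8;
smoothed  = 0;

rawValues = 0.2 + 0.6 * rand(1, 200);    % stand-in for the live Emotiv stream
for k = 1:numel(rawValues)
    smoothed = (1 - alpha) * smoothed + alpha * rawValues(k);
    param = min(max((smoothed - lowBound) / (highBound - lowBound), 0), 1);
    % 'param' would be sent to the visuals (e.g. over OSC) at this point
end
[/code]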