The core of my issue is that the values that are displayed in BCI2000 in the "visualize brain signals" window are unreasonably huge. These values also appear to be what gets propagated through the signalProcessing module and to the application module. The values that get stored on disk are fine, only the visualized/real-time processed values have this extra gain factor that I cannot account for.
Okay, the story:
For testing, I am using a sensor whose data I will want to acquire (in an experiment) along with the neural signals. In my testing setup I am only acquiring the sensor, no neural signals. According to the channel properties dialog in Central (Blackrock's configuration utility), this channel should have an input dynamic range of +/- 5000 mV (obviously that's +/- 5 V, but Central displays it in mV). The input digital range of the quantizer is 16 bit, or +/- 32767. My sensor has a baseline offset of about 75 mV. I've confirmed that with an oscilloscope, and that's what the display in Central indicates as well. If I record a short BCI2000 data file and then load it up in MATLAB, the baseline occurs at around a value of 500. The SourceChGain for that channel is 0.1526 mV/(AD unit), which is 10000 mV / 65535 AD units. Then, 500 * 0.1526 is about 75 mV. So that all checks out, and the gain factors for recorded data are perfectly reasonable.
The problem occurs when I bring up the BCI2000 window for visualizing the raw signal. At the default scale (the one that comes up before I make any changes), the trace is off the top of the window. If I right-click in the window and click on "show Unit", it comes up as 200 nV. If I make the necessary manipulations to actually see the signal, either by manually adjusting the SourceMin and SourceMax parameters in the Visualize tab, or by right-clicking in the visualization window and clicking "auto-scale", I find that the baseline value of the sensor trace reads as just under 500,000,000. Since I know the actual value (in AD units) is around 500, I figure the visualizer is assuming the value is in Volts rather than AD units, and displaying it in uV (microvolts): 500 V is exactly 500,000,000 uV. So my question is: how do I get the visualizer to see the real value and interpret it correctly? The signal source module is correctly recording the data to disk, as I mentioned above. The problem is (I think) occurring in the messages sent between the signal source module and the operator. At least, from what I've read in the documentation, it is the operator that is responsible for displaying the real-time signals.
One possible clue is the data types. I've used BCI2000 before with two other source modules, and neither of them had this problem of unreasonable gain in the visualized signals. When I went back and checked the code for those modules, I noticed that they acquired the data in float32, which seems to be the default precision with which messages are exchanged back and forth between the different modules in BCI2000. The Blackrock module, by contrast, works with int16. That's actually appropriate, since Central records in int16, and there is code in the signal source module that seems to my eye to account for the signal type correctly:
Code:
int numberOfChannels = Parameter( "SourceCh" );        // number of acquired channels
int samplesPerBlock  = Parameter( "SampleBlockSize" ); // samples per data block
SignalType sigType = SignalType::int16;                // match Central's int16 recording format
Output = SignalProperties( numberOfChannels, samplesPerBlock, sigType );
These are the final four lines of the Preflight function. The BCI2000 documentation makes it seem like int16 is a valid signal type, but maybe it's somehow messing things up here?
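One thing I've been meaning to try, based on reading other source modules: SignalProperties carries a PhysicalUnit (via ValueUnit()) that, as I understand it, tells the visualizer how to map raw values to physical units. A sketch of what I'd add to Preflight, untested, and assuming ValueUnit()/SetGain()/SetSymbol() behave the way they appear to in other modules:

```cpp
// Untested sketch: describe how raw int16 AD units map to physical values,
// so the operator's visualizer can label and scale them correctly.
// Assumes SourceChGain is in mV per AD unit, as computed above.
double gainVoltsPerADUnit = 1e-3 * Parameter( "SourceChGain" )( 0 ); // mV -> V
Output.ValueUnit().SetOffset( 0 )
                  .SetGain( gainVoltsPerADUnit )
                  .SetSymbol( "V" );
```

I don't know whether the operator actually honors this for the source visualization, or whether the unit has to be set somewhere else, so corrections welcome.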
If I only had to record data for offline analysis, this issue wouldn't matter that much to me. Unfortunately, the version of the signal with this huge gain is what gets propagated through the signal processing module and to the application module. I want to do closed-loop control with this system, so the online version of the signal is important.
I've gone through the code for the Blackrock module, but any modifications I make only seem to affect the data that gets recorded, not the data that gets processed online. I imagine that I might need to investigate CoreModule.cpp or something similar, but I'm not sure where to begin. Can anyone offer any advice?
Thanks