
signal control and cursor task

Posted: 05 Aug 2015, 07:18
by joanllo
Hi guys,
I’m currently starting my PhD project on EEG signal processing and neurofeedback. I’ve been following the BCI2000 manual and wiki, since I’m running a motor imagery paradigm (right and left hand) to train the mu rhythm (with a BrainAmp MR). I think I understand the purpose of the screening session and the pipeline for configuring the online feedback session through the different filters (spatial filter, spectral estimator, classifier, normalizer…). But I have a few questions:
1. How does the set of signal features (e.g., power at 12 Hz for two spatially filtered channels, C3_out and C4_out) map onto actual cursor movement? I understand that the output of the Normalizer (power) is the control signal, but does the cursor move in one direction when the power rises and in the other when it drops? Is it comparing the power of the two channels, or the two conditions (left vs. right MI)? Or does the cursor simply follow changes in power over time?
2. I think the only way to define the power threshold (max or min) that the subject must reach to receive positive feedback is indirect: it depends on the coordinates of the targets. The larger the distance between the cursor and the target, the more power change is needed to hit it. Is there some way to establish an explicit threshold value?

Sorry for the inconvenience; I’m really new to neurofeedback software.

Thank you.

Re: signal control and cursor task

Posted: 05 Aug 2015, 14:18
by pbrunner
Joanllo,

for neurofeedback training I typically recommend the basket task. In this task, the cursor starts in the center of the screen, and the subject's task is to modulate his/her brain signals to move the cursor up or down. The combination of covert task, scalp location, and frequency is typically established in a screening session. In this session, you would have the subject perform several overt and covert tasks (e.g., open/close the hand). You then analyze the data using the BCI2000 OfflineAnalysis tool. Next, you would configure BCI2000 for the real-time neurofeedback training:

1) Transfer all recorded channels (i.e., TransmitChList)
2) SpatialFilter to compute Laplacian filters over the relevant scalp locations (e.g., C3, Cz, C4)
3) ARFilter to estimate the spectral power of these Laplacian filter outputs
4) LinearClassifier to select the scalp location and frequency combination(s)
5) LPFilter to smooth the selected signal
6) Normalizer to create a control signal (i.e., the cursor velocity, in pixels per sample block) that has zero mean and a standard deviation of one
7) CursorTask to translate the velocity into cursor movement
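As a rough sketch of what one pass through this chain computes, here is an illustrative Python fragment (this is not BCI2000 code; the channel indices, Laplacian weights, frequency band, classifier weight, and normalizer values are assumptions, and a plain FFT stands in for the AR spectral estimator):

```python
import numpy as np

# One block of data: 8 channels x 160 samples (e.g., 160 Hz, 1 s block)
rng = np.random.default_rng(0)
block = rng.standard_normal((8, 160))

# 1) Spatial filter: small Laplacian around C3 (channel indices are placeholders)
spatial = np.zeros((1, 8))
spatial[0, 3] = 1.0               # C3
spatial[0, [0, 2, 4, 7]] = -0.25  # surrounding electrodes
c3_out = spatial @ block          # shape (1, 160)

# 2) Spectral estimate: mu-band power via FFT (BCI2000 uses an AR model instead)
fs = 160.0
spectrum = np.abs(np.fft.rfft(c3_out[0]))**2 / c3_out.shape[1]
freqs = np.fft.rfftfreq(c3_out.shape[1], 1.0 / fs)
mu_power = spectrum[(freqs >= 9) & (freqs <= 15)].mean()

# 3) Linear classifier: weight * feature (a single feature here)
feature = 1.0 * mu_power

# 4) Normalizer: subtract offset, scale by gain -> zero mean, unit variance
offset, gain = 10.9, 0.54         # placeholder values
control_signal = (feature - offset) * gain

# 5) The control signal is interpreted as cursor velocity (pixels per block)
print(control_signal)
```

The point of the sketch is only the data flow: raw channels in, one normalized scalar per block out, which the CursorTask then adds to the cursor position.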

Now to your specific questions:
@1: The cursor moves as a function of the cursor velocity computed by the Normalizer stage. This velocity is in pixels per BCI2000 sample block. The block duration is a function of the sampling rate and the sample block size, both of which you can configure in the Source tab of BCI2000.
@2: To overcome this limitation, I recommend that you stick with the basket task, as it gives the subject as much time as needed to learn how to modulate his/her brain signals. This is an advantage over the so-called right-justified-box task, in which the cursor moves at a constant speed towards two targets and the subject is under time pressure to control the brain signals within a certain period.
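For example, the block duration follows directly from those two Source-tab settings (the values below are hypothetical):

```python
# BCI2000 block duration = SampleBlockSize / SamplingRate
sampling_rate = 256       # Hz (hypothetical Source-tab setting)
sample_block_size = 8     # samples per block (hypothetical)

block_duration = sample_block_size / sampling_rate  # seconds per block
updates_per_second = 1.0 / block_duration           # cursor updates per second

print(block_duration, updates_per_second)
```

With these values the cursor position would be updated every 31.25 ms, i.e., 32 times per second.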

I have attached the following BCI2000 parameter fragment that configures a basket task. For more details, please see the PDF file [1], which describes the configuration of this task.

Code:

Filtering:SpatialFilter int SpatialFilterType= 1 2 0 3 // spatial filter type 0: none, 1: full matrix, 2: sparse matrix, 3: common average reference (CAR) (enumeration)
Filtering:SpatialFilter:SpatialFilter matrix SpatialFilter= { C3_OUT Cz_OUT C4_OUT } { F3 F4 T7 C3 Cz C4 T8 Pz } -0.25 0.00 -0.25 1.00 -0.25 0.00 0.00 -0.25 -0.20 -0.20 0.00 -0.20 1.00 -0.20 0.00 -0.20 0.00 -0.25 0.00 0.00 -0.25 1.00 -0.25 -0.25 0 % % // columns represent input channels, rows represent output channels
Filtering:SpatialFilter:SpatialFilter intlist SpatialFilterCAROutput= 0 // when using CAR filter type: list of output channels, or empty for all channels
Filtering:SpatialFilter:SpatialFilter int SpatialFilterMissingChannels= 1 0 0 1 // how to handle missing channels 0: ignore, 1: report error (enumeration)
Filtering:Spectral%20Estimation:ARThread float FirstBinCenter= 0Hz 0Hz % % // Center of first frequency bin (in Hz)
Filtering:Spectral%20Estimation:ARThread float LastBinCenter= 30Hz 30Hz % % // Center of last frequency bin (in Hz)
Filtering:Spectral%20Estimation:ARThread float BinWidth= 6Hz 3Hz % % // Width of spectral bins (in Hz)
Filtering:Spectral%20Estimation:ARThread int OutputType= 0 0 0 2 // 0: Spectral Amplitude, 1: Spectral Power, 2: Coefficients (enumeration)
Filtering:AR%20Spectral%20Estimator:ARThread int ModelOrder= 20 16 0 % // AR model order
Filtering:AR%20Spectral%20Estimator:ARThread int EvaluationsPerBin= 15 15 1 % // Number of uniformly spaced evaluation points entering into a single bin value
Filtering:Spectral%20Estimation:SpectralEstimatorChoice int SpectralEstimator= 1 1 0 2 // Choice of spectral estimation algorithm, 0: None, 1: AR, 2: FFT (enumeration)
Filtering:Windowing:WindowingThread int WindowLength= 0.5s 0.5s % % // Length of window
Filtering:Windowing:WindowingThread int Detrend= 1 0 0 2 // Detrend data? 0: no, 1: mean, 2: linear (enumeration)
Filtering:Windowing:WindowingThread int WindowFunction= 0 0 0 3 // Window function 0: Rectangular, 1: Hamming, 2: Hann, 3: Blackman (enumeration)
Filtering:LinearClassifier matrix Classifier= 1 { input%20channel input%20element%20(bin) output%20channel weight } 2 12Hz 2 1 // Linear classification matrix in sparse representation
Filtering:LPFilter float LPTimeConstant= 1s 16s 0 % // time constant for the low pass filter
Filtering:ExpressionFilter string StartRunExpression= % // expression executed on StartRun
Filtering:ExpressionFilter string StopRunExpression= % // expression executed on StopRun
Filtering:ExpressionFilter matrix Expressions= 0 1 // expressions used to compute the output of the ExpressionFilter (rows are channels; empty matrix for none)
Filtering:Normalizer floatlist NormalizerOffsets= 2 0 10.9104 0 % % // normalizer offsets
Filtering:Normalizer floatlist NormalizerGains= 2 0 0.538485 0 % % // normalizer gain values
Filtering:Normalizer intlist Adaptation= 2 0 2 0 0 2 // 0: no adaptation, 1: zero mean, 2: zero mean, unit variance (enumeration)
Filtering:Normalizer matrix BufferConditions= 2 2 0 (Feedback)&&(TargetCode==1) 0 (Feedback)&&(TargetCode==2) // expressions corresponding to data buffers (columns correspond to output channels, multiple rows correspond to multiple buffers)
Filtering:Normalizer float BufferLength= 30s 9s % % // time window of past data per buffer that enters into statistic
Filtering:Normalizer string UpdateTrigger= (Feedback==0) // expression to trigger offset/gain update when changing from 0 (use empty string for continuous update)
Visualize:Processing%20Stages int VisualizeSpatialFilter= 0 0 0 1 // Visualize SpatialFilter output (boolean)
Visualize:Processing%20Stages int VisualizeSpectralEstimator= 0 0 0 1 // Visualize SpectralEstimator output (boolean)
Visualize:Processing%20Stages int VisualizeLinearClassifier= 1 0 0 1 // Visualize LinearClassifier output (boolean)
Visualize:Processing%20Stages int VisualizeExpressionFilter= 0 0 0 1 // Visualize ExpressionFilter output (boolean)
Visualize:Processing%20Stages int VisualizeLPFilter= 0 0 0 1 // Visualize LPFilter output (boolean)
Visualize:Processing%20Stages int VisualizeNormalizer= 0 0 0 1 // Visualize Normalizer output (boolean)
Application:Application%20Window:ApplicationWindow int WindowWidth= 640 640 0 % // width of Application window
Application:Application%20Window:ApplicationWindow int WindowHeight= 600 480 0 % // height of Application window
Application:Application%20Window:ApplicationWindow int WindowLeft= 0 0 % % // screen coordinate of Application window's left edge
Application:Application%20Window:ApplicationWindow int WindowTop= 0 0 % % // screen coordinate of Application window's top edge
Application:Application%20Window:ApplicationWindow string WindowBackgroundColor= 0x000000 0x505050 % % // Application window background color (color)
Application:Sequencing:FeedbackTask float PreRunDuration= 2s 2s 0 % // duration of pause preceding first trial
Application:Sequencing:FeedbackTask float PreFeedbackDuration= 2s 2s 0 % // duration of target display prior to feedback
Application:Sequencing:FeedbackTask float FeedbackDuration= 10s 3s 0 % // duration of feedback
Application:Sequencing:FeedbackTask float PostFeedbackDuration= 1s 1s 0 % // duration of result display after feedback
Application:Sequencing:FeedbackTask float ITIDuration= 1s 1s 0 % // duration of inter-trial interval
Application:Sequencing:FeedbackTask float MinRunLength= 120s 120s 0 % // minimum duration of a run; if blank, NumberOfTrials is used
Application:Sequencing:FeedbackTask int NumberOfTrials= % 0 0 % // number of trials; if blank, MinRunLength is used
Application:Targets:FeedbackTask int NumberTargets= 2 2 0 255 // number of targets
Application:Targets:FeedbackTask intlist TargetSequence= 0 1 % % // fixed sequence in which targets should be presented (leave empty for random)
Application:Window:CursorFeedbackTask int RenderingQuality= 0 0 0 1 // rendering quality: 0: low, 1: high (enumeration)
Application:Sequencing:CursorFeedbackTask float MaxFeedbackDuration= 4s % 0 % // abort a trial after this amount of feedback time has expired
Application:3DEnvironment:CursorFeedbackTask floatlist CameraPos= 3 50 50 150 // camera position vector in percent coordinates of 3D area
Application:3DEnvironment:CursorFeedbackTask floatlist CameraAim= 3 50 50 50 // camera aim point in percent coordinates
Application:3DEnvironment:CursorFeedbackTask int CameraProjection= 0 0 0 2 // projection type: 0: flat, 1: wide angle perspective, 2: narrow angle perspective (enumeration)
Application:3DEnvironment:CursorFeedbackTask floatlist LightSourcePos= 3 50 50 150 // light source position in percent coordinates
Application:3DEnvironment:CursorFeedbackTask int LightSourceColor= 0xffffff // light source RGB color (color)
Application:3DEnvironment:CursorFeedbackTask int WorkspaceBoundaryColor= 0xffff00 0 % % // workspace boundary color (0xff000000 for invisible) (color)
Application:3DEnvironment:CursorFeedbackTask string WorkspaceBoundaryTexture= images\grid.bmp // path of workspace boundary texture (inputfile)
Application:Cursor:CursorFeedbackTask float CursorWidth= 10 10 0.0 % // feedback cursor width in percent of screen width
Application:Cursor:CursorFeedbackTask int CursorColorFront= 0xff0000 // cursor color when it is at the front of the workspace (color)
Application:Cursor:CursorFeedbackTask int CursorColorBack= 0xff0000 // cursor color when it is in the back of the workspace (color)
Application:Cursor:CursorFeedbackTask string CursorTexture= % // path of cursor texture (inputfile)
Application:Cursor:CursorFeedbackTask floatlist CursorPos= 3 50 50 50 // cursor starting position
Application:Targets:CursorFeedbackTask matrix Targets= 2 { pos%20x pos%20y pos%20z width%20x width%20y width%20z } 50 4 50 100 8 8 50 96 50 100 8 8 // target positions and widths in percentage coordinates
Application:Targets:CursorFeedbackTask int TargetColor= 0xff0000 // target color (color)
Application:Targets:CursorFeedbackTask string TargetTexture= % // path of target texture (inputfile)
Application:Targets:CursorFeedbackTask int TestAllTargets= 0 0 0 1 // test all targets for cursor collision? 0: test only the visible current target, 1: test all targets (enumeration)

Regards, Peter

[1] http://tinyurl.com/q83axf3

Re: signal control and cursor task

Posted: 06 Aug 2015, 07:25
by joanllo
Many thanks for your quick reply Peter. It has been very helpful.

I’ll do the basket task, following your advice. After checking your script, I have one question about the feedback duration parameters:
FeedbackDuration: 10 s
MaxFeedbackDuration: 4 s
Shouldn’t the first parameter be longer than the second? I think this leads to a series of aborted trials, since the cursor speed is too small to hit the target within 4 s.
I read the definition of the FeedbackDuration parameter on the wiki and noticed that it is not a hard limit on trial duration but instead determines the cursor speed. However, I don’t understand the mathematical relation between cursor speed, BCI2000 block size, and screen update rate.

Thanks to you again,
Joanllo

Re: signal control and cursor task

Posted: 06 Aug 2015, 09:48
by pbrunner
Joanllo,

to control the maximum cursor speed in the basket task you have two avenues:

1) You adjust the FeedbackDuration and leave it to the adaptation to adjust the mean and std of the control signal.

2) You adjust the NormalizerGain after the first couple of trials. In this case you fix the FeedbackDuration and set Adaptation=2 for the first couple of trials. You then assume that the variance of the control signal stays constant and set Adaptation=1, so the adaptation continues to keep the mean at zero.
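A simplified sketch of what the Adaptation=2 statistics amount to (in reality, BCI2000 maintains separate buffers per BufferCondition over the BufferLength window; the simulated values here are arbitrary):

```python
import numpy as np

# Simulated classifier outputs buffered during the two conditions
rng = np.random.default_rng(1)
buffer_up = rng.normal(12.0, 2.0, 500)    # e.g., samples while TargetCode==1
buffer_down = rng.normal(9.0, 2.0, 500)   # e.g., samples while TargetCode==2

# Pool the buffered data and derive offset and gain from its statistics
pooled = np.concatenate([buffer_up, buffer_down])
offset = pooled.mean()        # NormalizerOffset: removes the mean
gain = 1.0 / pooled.std()     # NormalizerGain: scales to unit variance

# The resulting control signal has zero mean and a std of one
normalized = (pooled - offset) * gain
print(offset, gain)
```

Freezing the gain (Adaptation=1) then corresponds to keeping `gain` fixed while `offset` continues to track the mean.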

Under the assumption that the control signal is a perfect -1 or +1 for the up/down conditions, the maximum cursor speed v_0 is given by the following equation:

v_0 = (screen width) / ( 2*FeedbackDuration);

The actual cursor speed is then the product of v_0 and the control signal:

v = v_0 * (control signal)

All values are in pixels per whichever unit you provide the FeedbackDuration in, i.e., seconds or sample blocks. If you scale the NormalizerGain, e.g., by disabling the adaptation of the std and manually adjusting the value, this also scales v_0 and v.
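To make this concrete with the numbers from the fragment above (WindowWidth=640, FeedbackDuration=10 s):

```python
screen_width = 640.0        # pixels (WindowWidth in the fragment above)
feedback_duration = 10.0    # seconds (FeedbackDuration in the fragment above)

# Maximum speed: starting at the center, the cursor covers half the screen
# within FeedbackDuration when the control signal is a constant +/-1
v0 = screen_width / (2 * feedback_duration)   # pixels per second

# Actual speed for some example control-signal values
speeds = [v0 * c for c in (-1.0, 0.5, 1.0)]
print(v0, speeds)
```

So with a perfect control signal of +1, the cursor moves 32 pixels per second and reaches the edge in exactly 10 s; weaker or negative control signals scale the speed accordingly.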

In the parameter fragment I posted above, I chose avenue #2, hence the short MaxFeedbackDuration.

In general, I recommend first testing your basket task with the SignalGenerator source module, in which you can modulate the amplitude of a sinusoidal signal with your mouse [1]. This lets you play with your filter chain and the timing settings.

Setting Adaptation=1 after the first couple of trials removes another parameter that might confuse the subject as it changes. Most importantly, it allows the subject to hit the target faster over time, which serves as a reward for increased mental effort and is necessary to keep motivation up.

Let me know if this answers your questions.

Regards, Peter

[1] http://www.bci2000.org/wiki/index.php/U ... eAmplitude

Re: signal control and cursor task

Posted: 11 Aug 2015, 05:25
by joanllo
Thank you so much, Peter; all the info has been very useful. I will keep all your tips in mind.
In September I will start the basket task with several subjects to refine the paradigm. I’m sure more questions will arise along the way.

Thank you again.

Joanllo

Re: signal control and cursor task

Posted: 09 Oct 2015, 01:39
by nadiamz123
pbrunner wrote:for neurofeedback training I typically recommend the basket task. […]

Wow, thanks for sharing this great information!