NeuroSky SDK for .NET: Development Guide and API Reference
NeuroSky, Inc.
This guide teaches you how to use the NeuroSky SDK for .NET to write Windows apps that receive and use bio-signal data, such as EEG, acquired from NeuroSky's sensor hardware.
This guide (and the entire NeuroSky SDK for .NET for that matter) is intended for programmers who are already familiar with standard .NET development using Microsoft Visual Studio. If you are not already familiar with developing for .NET, please first visit http://www.microsoft.com/net to learn how to set up your .NET development environment and create typical .NET apps.
If you are already familiar with creating typical .NET apps, then the next step is to make sure you have downloaded the NeuroSky SDK for .NET. Chances are, if you're reading this document, then you already have it.
Currently available from: thinkgear_sdk_for_net
Note that http://developer.neurosky.com also currently links to a separate library called ThinkGearNET, written by a third-party developer and hosted at http://thinkgearnet.codeplex.com. That library is not the NeuroSky SDK for .NET described in this guide; the two should not be confused.
You'll find the .dll, configuration files, and 3rd-party license documents in the libs/ folder. Copy this entire folder into your project.
You'll find the source code of the “HelloEEG Sample Project” in the Sample Projects/HelloEEG folder.
The NeuroSky SDK for .NET can only be used with the following NeuroSky devices, chips, and modules:
Before using any Windows app that uses the NeuroSky SDK for .NET, make sure you have paired the sensor hardware device to your Windows machine by carefully following the instructions in the User Manual that came with each sensor hardware device! The sensor hardware device must properly appear in your Windows machine's list of COM ports in Device Manager.
HelloEEG is a sample project included in the NeuroSky SDK for .NET that demonstrates how to set up a connection to a NeuroSky device and handle the data it sends. Add the project to Visual Studio by following these steps:
These steps have been tested with Visual Studio 2010; if your version is different, you may have to adapt these instructions.
The TG-HelloEEG.exe reference program is built from these same sources with the same process. It differs slightly in that the Microsoft ILMerge tool has been used to merge the DLLs from the /neurosky folder into the .exe so that it can function in a more standalone way.
The NeuroSky SDK for .NET's API is made available to your application via the NeuroSky.ThinkGear namespace, which is contained in ThinkGear.dll.

To start, add the ThinkGear.dll file to your .NET application's project workspace. ThinkGear.dll is a C# .NET library and can only be used in .NET projects (it will not work in native projects).

Once you have added ThinkGear.dll to your project, add the following line to the top of your application's source file to access the NeuroSky.ThinkGear namespace:
using NeuroSky.ThinkGear;
The NeuroSky.ThinkGear namespace consists of two classes:

* Connector - Connects to the computer's serial COM port and reads in the port's serial stream of data as DataRowArrays.
* TGParser - Parses a DataRowArray into recognizable ThinkGear Data Types that your application can use.
To use the classes, first declare a Connector instance and initialize it:
private Connector connector;

connector = new Connector();
Next, create EventHandlers to handle each type of Connector event, and link those handlers to the Connector events:
connector.DeviceConnected += new EventHandler( OnDeviceConnected );
connector.DeviceFound += new EventHandler( OnDeviceFound );
connector.DeviceNotFound += new EventHandler( OnDeviceNotFound );
connector.DeviceConnectFail += new EventHandler( OnDeviceNotFound );
connector.DeviceDisconnected += new EventHandler( OnDeviceDisconnected );
connector.DeviceValidating += new EventHandler( OnDeviceValidating );
In the handler for the DeviceConnected event, you should create another EventHandler to handle DataReceived events from the Device, like this:
void OnDeviceConnected( object sender, EventArgs e )
{
    Connector.DeviceEventArgs deviceEventArgs = (Connector.DeviceEventArgs)e;
    Console.WriteLine( "New Headset Created." + deviceEventArgs.Device.DevicePortName );
    deviceEventArgs.Device.DataReceived += new EventHandler( OnDataReceived );
}
Now, whenever data is received from the device, the DataReceived handler will process that data. Here is an example OnDataReceived() that shows how to do this, using a TGParser to parse the DataRow[]:
void OnDataReceived( object sender, EventArgs e )
{
    /* Cast the event sender as a Device object, and e as the Device's DataEventArgs */
    Device d = (Device)sender;
    Device.DataEventArgs de = (Device.DataEventArgs)e;

    /* Create a TGParser to parse the Device's DataRowArray[] */
    TGParser tgParser = new TGParser();
    tgParser.Read( de.DataRowArray );

    /* Loop through the TGParser's parsed data... */
    for( int i = 0; i < tgParser.ParsedData.Length; i++ )
    {
        // See the Data Types documentation for valid keys such
        // as "Raw", "PoorSignal", "Attention", etc.
        if( tgParser.ParsedData[i].ContainsKey("Raw") )
        {
            Console.WriteLine( "Raw Value:" + tgParser.ParsedData[i]["Raw"] );
        }
        if( tgParser.ParsedData[i].ContainsKey("PoorSignal") )
        {
            Console.WriteLine( "PQ Value:" + tgParser.ParsedData[i]["PoorSignal"] );
        }
        if( tgParser.ParsedData[i].ContainsKey("Attention") )
        {
            Console.WriteLine( "Att Value:" + tgParser.ParsedData[i]["Attention"] );
        }
        if( tgParser.ParsedData[i].ContainsKey("Meditation") )
        {
            Console.WriteLine( "Med Value:" + tgParser.ParsedData[i]["Meditation"] );
        }
    }
}
When you would like to begin the Familiarity and/or Mental Effort calculations, use the connector to enable them:
connector.setTaskFamiliarityEnable(true);
connector.setMentalEffortEnable(true);
Once you have the handlers set up as described above, you can have your Connector actually connect to a device/headset/COM port by using one of the Connect methods described in "Connect to a device" below. If the portName is valid and the connection is successful, then your OnDataReceived() method will automatically be called and executed whenever data arrives from the headset.
Before exiting, your application must close the Connector's open connections by calling the Connector's close() method:
connector.close();
If close() is not called on an open connection, and that connection's process is still alive (i.e. a background thread, or a process that only closed the GUI window without terminating the process itself), then the headset will remain connected to that process, and no other process will be able to connect to the headset until it is disconnected.
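For example, in a WinForms application one reasonable place to do this is the form's closing handler. The sketch below assumes a Form subclass holding the connector field from the earlier example; the surrounding structure is illustrative, not part of the SDK:

protected override void OnFormClosing( FormClosingEventArgs e )
{
    // Release the COM port so other processes can connect to the headset later.
    if( connector != null )
    {
        connector.close();
    }
    base.OnFormClosing( e );
}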
If you choose to connect by specifying a specific COM port, the Connector takes the following steps:
If you choose to connect by using the AUTO approach, the Connector takes the following steps:
The Connector remembers the portName that was able to successfully connect, and tries to connect to that same portName first the next time a connection attempt is made. If that remembered portName is no longer valid or is unable to connect, then you can use the ConnectScan( string portName ) method to find another valid portName.
Attempts to open a connection with the port name specified by portName. Calling this method results in one of two events being broadcasted:
Attempts to open a connection to the first Device seen by the Connector. Calling this method results in one of two events being broadcasted:
Same as ConnectScan but scans the port specified by portName first. Calling this method results in one of two events being broadcasted:
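Putting the three connection methods together, a typical call site might look like the sketch below. The method names follow the descriptions in this section, and the port name "COM3" is only an example; check Device Manager for the actual port assigned to your headset:

// Try only the named port:
connector.Connect( "COM3" );

// Or scan serial ports and connect to the first ThinkGear device found:
connector.ConnectScan();

// Or scan, but try the specified port first:
connector.ConnectScan( "COM3" );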
Closes all open connections. Calling this method will result in the following event being broadcasted for each open device:
Closes a specific connection specified by connection. Calling this method will result in the following event being broadcasted for a specific open device:
Closes a specific Device specified by device. Calling this method will result in the following event being broadcasted for a specific open device:
Sends an array of bytes to a specific port.
Starts recording data for 60 seconds. Once the recording is complete, the Mental Effort will be calculated. Note: the first time the Mental Effort is calculated, the result is 0.
Starts recording data for 60 seconds. Once the recording is complete, the Familiarity will be calculated. Note: the first time the Familiarity is calculated, the result is 0.
Occurs when a ThinkGear device is found. This is where the application chooses to connect to that port or not.
Occurs when a ThinkGear device could not be found. This is usually where the application displays an error that it did not find any device.
Occurs right before the Connector attempts to connect to a serial port. Mainly used to notify the GUI which port the Connector is trying to connect to.
Occurs when a ThinkGear device is connected. This is where the application attaches its OnDataReceived handler for that device.
Occurs when the Connector fails to connect to the specified port.
Occurs when the Connector disconnects from a ThinkGear device.
Occurs when data is available from a ThinkGear device.
The TGParser class is used to convert the received data into easily accessible data contained in a Dictionary.
Parses the raw headset data in dataRow and returns a dictionary of usable data. It also stores the dictionary in the ParsedData property.
When connected to a MindSet, MindWave, or MindWave Mobile headset, the Read() method can return the following standard keys in its dictionary:
Key | Description | Data Type |
---|---|---|
Time | Timestamp of the received packet | double |
Raw | Raw EEG data | short |
EegPowerDelta | Delta Power | uint |
EegPowerTheta | Theta Power | uint |
EegPowerAlpha1 | Low Alpha Power | uint |
EegPowerAlpha2 | High Alpha Power | uint |
EegPowerBeta1 | Low Beta Power | uint |
EegPowerBeta2 | High Beta Power | uint |
EegPowerGamma1 | Low Gamma Power | uint |
EegPowerGamma2 | High Gamma Power | uint |
Attention | Attention eSense | double |
Meditation | Meditation eSense | double |
Zone | Performance Zone | double |
PoorSignal | Poor Signal | double |
BlinkStrength | Strength of detected blink. The Blink Strength ranges from 1 (small blink) to 255 (large blink). Unless a blink occurred, nothing will be returned. Blinks are only calculated if PoorSignal is less than 51. | uint |
Familiarity (BETA) | Familiarity measures how well the subject is learning a new task or how well his performance is with certain task. Familiarity algorithm can be used for both within-trial monitoring (continuous real-time tracking) and between-trial comparison. A trial can be of any length equal to or more than 1 minute. In each trial, the first output index will be given out after the first minute and new output indexes will then be generated at time interval defined by the output rate (default: 10s). | double |
Mental Effort (BETA) | Mental Effort measures how hard the subject’s brain is working, i.e. the amount of workload involved in the task. Mental Effort algorithm can be used for both within-trial monitoring (continuous real-time tracking) and between-trial comparison. A trial can be of any length equal to or more than 1 minute. In each trial, the first output index will be given out after the first minute and new output indexes will then be generated at time interval defined by the output rate (default: 10s). | double |
MindWandering | Mind Wandering Level. The Mind Wandering Level ranges from 1 (low Mind Wandering) to 10 (high Mind Wandering). The Mind Wandering algorithm updates once every 0.5 seconds, assuming PoorSignal is less than 51. If PoorSignal is above 51, nothing is returned. | double |
When connected to a ThinkCap, the Read() method can return the following keys in its dictionary:
Key | Description | Data Type |
---|---|---|
Time | Timestamp of the received packet | double |
RawCh1 | EEG Channel 1 | short |
RawCh2 | EEG Channel 2 | short |
RawCh3 | EEG Channel 3 | short |
RawCh4 | EEG Channel 4 | short |
RawCh5 | EEG Channel 5 | short |
RawCh6 | EEG Channel 6 | short |
RawCh7 | EEG Channel 7 | short |
RawCh8 | EEG Channel 8 | short |
The ThinkGear data types are generally divided into two groups: data types that are only applicable for EEG sensor devices, and data types that are typically applicable to all ThinkGear-based devices.
These data types are generally available from most or all types of ThinkGear hardware devices.
This integer value provides an indication of how good or how poor the bio-signal is at the sensor. This value is typically output by all ThinkGear hardware devices once per second.
This is an extremely important value for any app using ThinkGear sensor hardware to always read, understand, and handle. Depending on the use cases for your app and users, your app may need to alter the way it uses other data values depending on the current value of POOR_SIGNAL/SENSOR_STATUS. For example, if this value indicates that the bio-sensor is not currently contacting the subject, then any RAW_DATA or EEG_POWER values received during that time should be treated as floating noise not from a human subject, and possibly discarded based on the needs of the app. The value should also be used as a basis to prompt the user to adjust their sensors, or to put them on in the first place.
This version of the SDK converts the poorSignal values read from different hardware devices into a uniform format (unlike earlier versions of the SDK). If you have software that reacts to the poorSignal value, you should evaluate it to see whether changes need to be made.
For EEG sensor hardware: A value of 0 indicates that the bio-sensor is not able to detect any obvious problems with the signal at the sensor. Higher values from 1 to 199 indicate increasingly more detected problems with the signal. A value of 200 means the sensor contacts detect that they are not even all properly in contact with a conductive subject (for example, the EEG headset may currently not even be worn on any person's head).
For ECG/EKG (CardioChip) sensor hardware: A value of 200 indicates the bio-sensor contacts are all currently in contact with a conductive subject (such as a user's skin), while a value of 0 indicates the opposite: that not all the contacts are in proper contact with a conductive subject.
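As a sketch of the EEG-hardware case, an application might track the latest PoorSignal value and ignore Raw samples while the contacts appear to be off the head. The field name latestPoorSignal and the helper ProcessRawSample() are hypothetical; the key names come from the TGParser table above:

// Hypothetical field, updated from OnDataReceived(); starts as "not worn" (200).
double latestPoorSignal = 200;

// ... inside the OnDataReceived() loop, after tgParser.Read( de.DataRowArray ):
if( tgParser.ParsedData[i].ContainsKey("PoorSignal") )
{
    latestPoorSignal = Convert.ToDouble( tgParser.ParsedData[i]["PoorSignal"] );
}
if( tgParser.ParsedData[i].ContainsKey("Raw") && latestPoorSignal < 200 )
{
    // Only treat Raw samples as coming from a subject when the contacts are not
    // reporting an open circuit; adjust this policy to the needs of your app.
    ProcessRawSample( tgParser.ParsedData[i]["Raw"] );   // hypothetical helper
}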
Poor signal may be caused by a number of different things. In order of severity, they are:
For EEG modules, a certain amount of noise is unavoidable in normal usage of ThinkGear sensor hardware, and both NeuroSky's filtering technology and algorithms have been designed to detect, correct, compensate for, and tolerate many types of signal noise. Most typical users who are only interested in using the eSense™ values, such as Attention and Meditation, do not need to worry as much about the POOR_SIGNAL Quality value, except to note that the Attention and Meditation values will not be updated while POOR_SIGNAL is greater than zero, and that the headset is not being worn while POOR_SIGNAL is higher than 128. The POOR_SIGNAL Quality value is more useful to applications which need to be more sensitive to noise (such as some medical or research applications), or applications which need to know right away when there is even minor noise detected.
Background: this name is in transition. Historically, the value was called “Poor Signal Status” for the original EEG chips and modules, because the higher the reported value, the “poorer” the signal was; a value of 0 meant that no “poorness” was detected. When the ASIC team implemented the BMD10X chips, they inverted the semantics of the values (so that 200 meant good and 0 meant bad) and named their value SENSOR_STATUS, replacing the functionality of POOR_SIGNAL for that chip. Going forward, the term SENSOR_STATUS is preferred, and the term POOR_SIGNAL may eventually be deprecated; because changing the constant value would break existing code, any such change will be made carefully.

Note also that the mechanism for detecting the value 200 is entirely different from the rest of the range: it is a special value that looks for an impedance level so high that it suggests an open circuit. It is not values “near” 200, but exactly the value 200 (the other values never exceed 107 in current chips). On the other end of the range, only a value of exactly 0 is acceptable for most applications, including the eSense™ algorithms (they only operate when POOR_SIGNAL is 0).

Finally, be aware of the “watermelon test” problem: the electrical noise on the surface of a watermelon can often resemble the electrical characteristics of a brainwave, so the noise detection often cannot tell the two apart, and it has deliberately not been made so stringent that it produces false negatives when the sensors are on a person's head. This is why the description above says “obvious problems” rather than “noise”: because of the similarity between watermelon noise and brainwave, it is not an “obvious problem” that can be detected.
This data type supplies the raw sample values acquired at the bio-sensor. The sampling rate (and therefore output rate), possible range of values, and interpretations of those values (conversion from raw units to volt) for this data type are dependent on the hardware characteristics of the ThinkGear hardware device performing the sampling. You must refer to the documented development specs of each type of ThinkGear hardware that your app will support for details.
As an example, the majority of ThinkGear devices sample at 512Hz, with a possible value range of -32768 to 32767.
As another example, to convert TGAT-based EEG sensor values (such as TGAT, TGAM, MindWave Mobile, MindWave, MindSet) to voltage values, use the following conversion:
(rawValue * (1.8/4096)) / 2000
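A minimal sketch of that conversion as a helper method (the constants are taken directly from the formula above; confirm them against the development spec for your specific hardware):

// Convert a TGAT-family raw EEG sample to volts:
// 1.8 V reference, 12-bit range (4096), amplifier gain of 2000.
static double RawToVolts( short rawValue )
{
    return ( rawValue * ( 1.8 / 4096 ) ) / 2000;
}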
This data type is not currently used by any current commercially-available ThinkGear products. It is kept here for backwards compatibility with some end-of-life products, and as a placeholder for possible future products.
These data types are only available from EEG sensor hardware devices, such as the MindWave Mobile, MindSet, MindBand, and TGAM chips and modules.
This int value reports the current eSense™ Attention meter of the user, which indicates the intensity of a user's level of mental “focus” or “attention”, such as that which occurs during intense concentration and directed (but stable) mental activity. Its value ranges from 0 to 100. Distractions, wandering thoughts, lack of focus, or anxiety may lower the Attention meter levels. See eSense Meters below for details about interpreting eSense™ levels in general.
By default, output of this Data Value is enabled. It is typically output once a second.
This unsigned one-byte value reports the current eSense™ Meditation meter of the user, which indicates the level of a user's mental “calmness” or “relaxation”. Its value ranges from 0 to 100. Note that Meditation is a measure of a person's mental levels, not physical levels, so simply relaxing all the muscles of the body may not immediately result in a heightened Meditation level. However, for most people in most normal circumstances, relaxing the body often helps the mind to relax as well. Meditation is related to reduced activity by the active mental processes in the brain, and it has long been an observed effect that closing one's eyes turns off the mental activities which process images from the eyes, so closing the eyes is often an effective method for increasing the Meditation meter level. Distractions, wandering thoughts, anxiety, agitation, and sensory stimuli may lower the Meditation meter levels. See eSense Meters below for details about interpreting eSense™ levels in general.
By default, output of this Data Value is enabled. It is typically output once a second.
For all the different types of eSense™ (i.e. Attention, Meditation), the meter value is reported on a relative eSense™ scale of 1 to 100. On this scale, a value between 40 and 60 at any given moment in time is considered “neutral”, and is similar in notion to “baselines” that are established in conventional EEG measurement techniques (though the method for determining a ThinkGear baseline is proprietary and may differ from conventional EEG). A value from 60 to 80 is considered “slightly elevated”, and may be interpreted as levels being possibly higher than normal (levels of Attention or Meditation that may be higher than normal for a given person). Values from 80 to 100 are considered “elevated”, meaning they are strongly indicative of heightened levels of that eSense™.
Similarly, on the other end of the scale, a value between 20 and 40 indicates “reduced” levels of the eSense™, while a value between 1 and 20 indicates “strongly lowered” levels of the eSense™. These levels may indicate states of distraction, agitation, or abnormality, according to the opposite of each eSense™.
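As an illustration only, the ranges above can be mapped onto labels in code. The range labels come from the two paragraphs above; the handling of the shared boundary values (20, 40, 60, 80) is an arbitrary choice in this sketch:

// Map an eSense value (1-100) onto the descriptive ranges above.
static string DescribeESense( double eSenseValue )
{
    if( eSenseValue >= 80 ) return "elevated";
    if( eSenseValue >  60 ) return "slightly elevated";
    if( eSenseValue >= 40 ) return "neutral";
    if( eSenseValue >= 20 ) return "reduced";
    if( eSenseValue >=  1 ) return "strongly lowered";
    return "no reading";
}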
This value reports the current performance Zone of the subject. Its value ranges from 0 to 9, and it is sent only when the subject transitions from one Zone to another.
This algorithm uses the Attention and Meditation values to guide a subject to their best performance.
To be in the Elite Zone (9), the subject must hold their Attention level at a value of at least 94 while simultaneously holding their Meditation level steady or increasing.
To be in the Intermediate Zone (5), the subject must hold their Attention level at a value of at least 64 while simultaneously holding their Meditation level steady or increasing.
To be in the Beginner Zone (1), the subject must hold their Attention level at a value of at least 28 while simultaneously holding their Meditation level steady or increasing.
The Not Ready Zone (0) covers all Attention levels below 28, as well as subjects in the Beginner Zone whose Meditation levels are decreasing.
All Zone calculations are suspended and values reset if the sensor doesn't appear to be in good contact with a human.
This is a different implementation of performance Zone compared to other NeuroSky products. Reference: Golf Putting Training Algorithm v 2.0, September 2012, Dr. KooHyoung Lee.
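Reading the Zone value follows the same pattern as the other keys. A sketch, placed inside the OnDataReceived() loop shown earlier:

if( tgParser.ParsedData[i].ContainsKey("Zone") )
{
    // Sent only when the subject transitions from one Zone (0-9) to another.
    Console.WriteLine( "Zone Value:" + tgParser.ParsedData[i]["Zone"] );
}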
This int value reports the intensity of the user's most recent eye blink. Its value ranges from 1 to 255 and it is reported whenever an eye blink is detected. The value indicates the relative intensity of the blink, and has no units.
The detection of blinks must first be enabled:
if (setBlinkDetectionEnabled(true)) {
    // returns true, meaning success
    Console.WriteLine("HelloEEG: BlinkDetection is Enabled");
}
else {
    // returns false, meaning not supported because:
    //  + the connected hardware doesn't support it
    //  + it conflicts with another option already set
    //  + it is not supported by this version of the SDK
    Console.WriteLine("HelloEEG: BlinkDetection can not be Enabled");
}
The current configuration can be retrieved.
if (getBlinkDetectionEnabled()) {
    // returns true, meaning it is enabled
    Console.WriteLine("HelloEEG: BlinkDetection is configured");
}
else {
    // returns false, meaning it is not currently configured
    Console.WriteLine("HelloEEG: BlinkDetection is NOT configured");
}
If these methods are called before the MSG_MODEL_IDENTIFIED message has been received, the call is treated as a request to be processed once the connected equipment is identified. It is therefore possible to enable this feature and later find that it is no longer enabled: once the connected equipment has been identified, if the request is incompatible with the hardware or software, it will be overridden and the MSG_ERR_CFG_OVERRIDE message will be sent to provide notification.
This Data Value represents the current magnitude of 8 commonly-recognized types of EEG frequency bands.
The eight EEG powers are: delta (0.5 - 2.75Hz), theta (3.5 - 6.75Hz), low-alpha (7.5 - 9.25Hz), high-alpha (10 - 11.75Hz), low-beta (13 - 16.75Hz), high-beta (18 - 29.75Hz), low-gamma (31 - 39.75Hz), and mid-gamma (41 - 49.75Hz). These values have no units and are only meaningful for comparison to the values for the other frequency bands within a sample.
By default, output of this Data Value is enabled, and it is output approximately once a second.
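The band power values are read from the TGParser dictionary in the same way as the other keys. A sketch inside the OnDataReceived() loop shown earlier (only three of the eight bands shown; the remaining keys follow the same pattern):

if( tgParser.ParsedData[i].ContainsKey("EegPowerDelta") )
{
    Console.WriteLine( "Delta:" + tgParser.ParsedData[i]["EegPowerDelta"] );
}
if( tgParser.ParsedData[i].ContainsKey("EegPowerTheta") )
{
    Console.WriteLine( "Theta:" + tgParser.ParsedData[i]["EegPowerTheta"] );
}
if( tgParser.ParsedData[i].ContainsKey("EegPowerAlpha1") )
{
    Console.WriteLine( "Low Alpha:" + tgParser.ParsedData[i]["EegPowerAlpha1"] );
}
// ... EegPowerAlpha2, EegPowerBeta1, EegPowerBeta2, EegPowerGamma1, EegPowerGamma2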
This data type is not currently used by any current commercially-available ThinkGear products. It is kept here for backwards compatibility with some end-of-life products, and as a placeholder for possible future products.
Its value ranges from -100 to +100 and indicates how attentive the subject is: more negative values mean the subject is less attentive, and more positive values mean the subject is more attentive.
This feature is not currently available.
Positivity cannot be used concurrently with Familiarity or Mental Effort.
if (setPositivityEnable(true)) {
    // returns true, meaning success
    Console.WriteLine("HelloEEG: Positivity is Enabled");
}
else {
    // returns false, meaning not supported because:
    //  + the connected hardware doesn't support it
    //  + it conflicts with another option already set
    //  + it is not supported by this version of the SDK
    Console.WriteLine("HelloEEG: Positivity can not be Enabled");
}
The current configuration can be retrieved.
if (getPositivityEnable()) {
    // returns true, meaning it is enabled
    Console.WriteLine("HelloEEG: Positivity is configured");
}
else {
    // returns false, meaning it is not currently configured
    Console.WriteLine("HelloEEG: Positivity is NOT configured");
}
If these methods are called before the MSG_MODEL_IDENTIFIED message has been received, the call is treated as a request to be processed once the connected equipment is identified. It is therefore possible to enable this feature and later find that it is no longer enabled: once the connected equipment has been identified, if the request is incompatible with the hardware or software, it will be overridden and the MSG_ERR_CFG_OVERRIDE message will be sent to provide notification.
This algorithm is resource and computation intensive. If you need to run with the debugger, be aware that this calculation may take many minutes to complete while the debugger is engaged; it will still complete and present its results. Without the debugger engaged, this calculation should complete in a few seconds.
When applied to single-channel EEG data collected from the forehead area, the Familiarity algorithm measures how well a subject is learning a task, and correlates with how well the subject performs certain tasks. The Familiarity algorithm works well with both motor (e.g. drawing) and mental (e.g. reciting) tasks. The algorithm can be used for monitoring a subject's Familiarity curve and changes in real time (to see how their Familiarity Index changes as they perform the task), or for studying their Familiarity levels on a task across separate sessions or days.
The Familiarity algorithm must be applied to at least 1 minute of EEG data. A Familiarity Index will be output by the algorithm after the first 60s, and then every N seconds after that (N is the predefined output rate; default: 10s).
A typical way to use the Familiarity algorithm is for an application to prompt a subject to be relaxing, with their eyes open, doing nothing, for the first 60s while taking the initial measurement. The first Familiarity Index value (reported after the first 60s) is kept by the application as a baseline, against which subsequent Familiarity Index values (by default received every 10s) can be compared to determine relative percent changes (e.g. after relaxing for the first minute, the user starts doing a specific task. Their Familiarity Index goes up to +20% in the first 10s of the task compared to the first minute baseline, and then goes up to +35% in the next 10s of the task compared to the first minute baseline. After 40s, their Familiarity Index may dip to -15% from the first minute baseline, which may indicate they are no longer learning and absorbing the task as well, or may no longer be performing optimally at the task.)
Alternatively, another possible baseline measurement could be, instead of relaxing and doing nothing during the first 60s, to rather have the subject start doing the task right away and regard the initial part of the task as the baseline. You would then interpret the percentage changes accordingly, knowing that the baseline is based on the initial 60s engaging in the task.
For presenting, studying, and interpreting the data, each reported Familiarity Index (FI) value can be compared for percent changes either (1) against the baseline value, or (2) against the FI value immediately preceding it.
The Familiarity Index is a floating point number with arbitrary units. This means the FI numbers have no meaning on their own, and take on meaning only when comparing percentage changes between two or more values.
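A sketch of the baseline comparison described above. The method and variable names are illustrative, and the numeric values are placeholders only; the Familiarity Index values themselves come from the "Familiarity (BETA)" key in the TGParser dictionary:

// Percent change of the current Familiarity Index against a stored baseline.
static double PercentChange( double baseline, double current )
{
    return ( ( current - baseline ) / baseline ) * 100.0;
}

// Example: keep the first index reported after the first 60s as the baseline.
double familiarityBaseline = 42.0;   // illustrative value only
double latestIndex         = 50.4;   // illustrative value only
Console.WriteLine( "Familiarity change: " + PercentChange( familiarityBaseline, latestIndex ) + "%" );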
An example of how to use the algorithm can be found in the sample application, “HelloEEG”.
A few example tasks that the algorithm can be applied to: painting (“Color Pages”, available at <http://coloringkid.net/>) and dancing (“Just Dance”, available at <http://www.youtube.com/watch?v=4Hh7FY3n3wM>).
To use the Familiarity algorithm in the NeuroSky SDK/API library, it must first be enabled:
if (setTaskFamiliarityEnable(true)) {
    // returns true, meaning success
    Console.WriteLine("HelloEEG: Familiarity is Enabled");
}
else {
    // returns false, meaning not supported because:
    //  + the connected hardware doesn't support it
    //  + it conflicts with another option already set
    //  + it is not supported by this version of the SDK
    Console.WriteLine("HelloEEG: Familiarity can not be Enabled");
}
At any time, the status of the algorithm (whether it is enabled) can be queried:
if (getTaskFamiliarityEnable()) {
    // returns true, meaning it is enabled
    Console.WriteLine("HelloEEG: TaskFamiliarity is configured");
}
else {
    // returns false, meaning it is not currently configured
    Console.WriteLine("HelloEEG: TaskFamiliarity is NOT configured");
}
Using just setTaskFamiliarityEnable(true), the algorithm will be executed exactly one time. The execution begins as soon as 60 seconds of good data have been collected. After the results have been reported, the algorithm is automatically disabled.
It is possible to configure the algorithm to run continuously:
if (setTaskFamiliarityRunContinuous(true)) {
    // returns true, meaning success
    Console.WriteLine("HelloEEG: Familiarity Continuous operation");
}
else {
    // returns false, meaning not supported because:
    //  + the connected hardware doesn't support it
    //  + it conflicts with another option already set
    //  + it is not supported by this version of the SDK
    Console.WriteLine("HelloEEG: Familiarity normal operation");
}
Enabling Continuous Mode does not automatically also enable the algorithm itself; you must still call setTaskFamiliarityEnable(true) yourself.
The current Continuous Mode configuration can be retrieved:
if (getTaskFamiliarityRunContinuous()) {
    // returns true, meaning Continuous Mode is enabled
    Console.WriteLine("HelloEEG: Familiarity Continuous operation");
}
else {
    // returns false, meaning Continuous Mode is not currently configured
    Console.WriteLine("HelloEEG: Familiarity normal operation");
}
If these methods are called before the MSG_MODEL_IDENTIFIED message has been received, the call is treated as a request to be processed once the connected equipment is identified. It is therefore possible to enable this feature and later find that it is no longer enabled: once the connected equipment has been identified, if the request is incompatible with the hardware or software, it will be overridden and the MSG_ERR_CFG_OVERRIDE message will be sent to provide notification.
This algorithm is resource and computation intensive. If you need to run with the Debugger, be aware that this calculation may take many minutes to complete when the debugger is engaged. It will output its results only after its calculations are complete.
When applied to single-channel EEG data collected from the forehead area, the Mental Effort algorithm measures the amount of workload exerted by the subject's brain while performing a task (how hard the subject's brain is working while performing the task). The Mental Effort algorithm works well with both motor (e.g. drawing) and mental (e.g. reciting) tasks. The algorithm can be used for monitoring a subject's Mental Effort curve and changes in real time (to see how their Mental Effort Index changes as they perform the task), or for studying their Mental Effort levels on a task across separate sessions or days.
The Mental Effort algorithm must be applied to at least 1 minute of EEG data. A Mental Effort Index will be output by the algorithm after the first 60s, and then every N seconds after that (N is the predefined output rate; default: 10s).
A typical way to use the Mental Effort algorithm is for an application to prompt a subject to be relaxing, with their eyes open, doing nothing, for the first 60s while taking the initial measurement. The first Mental Effort Index value (reported after the first 60s) is kept by the application as a baseline, against which subsequent Mental Effort Index values (by default received every 10s) can be compared to determine relative percent changes (e.g. after relaxing for the first minute, the user starts doing a specific task. Their Mental Effort Index decreases by -20% in the first 10s of the task compared to the first minute baseline, and then goes down further to -35% in the next 10s of the task compared to the first minute baseline, indicating their brain is doing less and less work in the first 20s. After 40s, their Mental Effort Index goes back up to -5% from the first minute baseline, which may indicate their brain is starting to do more work again at that point.)
Alternatively, another possible baseline measurement could be, instead of relaxing and doing nothing during the first 60s, to rather have the subject start doing the task right away and regard the initial part of the task as the baseline. You would then interpret the percentage changes accordingly, knowing that the baseline is based on the initial 60s engaging in the task.
For presenting, studying, and interpreting the data, each reported Mental Effort Index (MEI) value can be compared for percent changes either (1) against the baseline value, or (2) against the MEI value immediately preceding it.
The Mental Effort Index is a floating point number with arbitrary units. This means the MEI numbers have no meaning on their own, and take on meaning only when comparing percentage changes between two or more values.
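The same kind of comparison applies to the Mental Effort Index. A short sketch reusing the hypothetical PercentChange() helper from the Familiarity section (the numeric values are placeholders only):

double mentalEffortBaseline = 100.0;  // illustrative value only, kept after the first 60s
double latestMeiValue       = 80.0;   // illustrative value only
// A negative result means the brain is doing less work than during the baseline period.
Console.WriteLine( "Mental Effort change: " + PercentChange( mentalEffortBaseline, latestMeiValue ) + "%" );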
An example of how to use the algorithm can be found in the sample application, “HelloEEG”.
A few example tasks that the algorithm can be applied to: arithmetic calculation (“Math 24”, available at http://www.24theory.com/, and “Addition Aliens Attack”, available at <http://www.imathgame.com/ChromeAddition.php>).
To use the Mental Effort algorithm in the NeuroSky SDK/API library, it must first be enabled:
if (setMentalEffortEnable(true)) {
    // returns true, meaning success
    Console.WriteLine("HelloEEG: MentalEffort is Enabled");
}
else {
    // returns false, meaning not supported because:
    //  + the connected hardware doesn't support it
    //  + it conflicts with another option already set
    //  + it is not supported by this version of the SDK
    Console.WriteLine("HelloEEG: MentalEffort can not be Enabled");
}
At any time, the status of the algorithm (whether it is enabled) can be queried:
if (getMentalEffortEnable()) {
    // returns true, meaning it is enabled
    Console.WriteLine("HelloEEG: MentalEffort is configured");
}
else {
    // returns false, meaning it is not currently configured
    Console.WriteLine("HelloEEG: MentalEffort is NOT configured");
}
Using just setMentalEffortEnable(true), the algorithm will be executed exactly one time. The execution begins as soon as 60 seconds of good data have been collected. After the results have been reported, the algorithm is automatically disabled.
It is possible to configure the algorithm to run continuously:
if (setMentalEffortRunContinuous(true)) {
    // returns true, meaning success
    Console.WriteLine("HelloEEG: MentalEffort Continuous operation");
}
else {
    // returns false, meaning not supported because:
    //  + the connected hardware doesn't support it
    //  + it conflicts with another option already set
    //  + it is not supported by this version of the SDK
    Console.WriteLine("HelloEEG: MentalEffort normal operation");
}
Enabling Continuous Mode does not automatically also enable the algorithm itself; you must still call setMentalEffortEnable(true) yourself.
The current Continuous Mode configuration can be retrieved.
if (getMentalEffortRunContinuous()) {
    // returns true, meaning Continuous Mode is enabled
    Console.WriteLine("HelloEEG: MentalEffort Continuous operation");
}
else {
    // returns false, meaning Continuous Mode is not currently configured
    Console.WriteLine("HelloEEG: MentalEffort normal operation");
}
If these methods are called before the MSG_MODEL_IDENTIFIED message has been received, the call is treated as a request to be processed once the connected equipment is identified. It is therefore possible to enable this feature and later find that it is no longer enabled: once the connected equipment has been identified, if the request is incompatible with the hardware or software, it will be overridden and the MSG_ERR_CFG_OVERRIDE message will be sent to provide notification.
Before releasing an app for real-world use, make sure your app considers or handles the following:
* If your app receives a MSG_STATE_CHANGE Message with any value other than STATE_CONNECTING or STATE_CONNECTED, it should carefully handle each possible error situation with an appropriate message to the user via the app's UI. Not handling these error cases well in the UI almost always results in an extremely poor user experience of the app. Here are some examples:
* If a STATE_ERR_BT_OFF Message is received, the user should be prompted to turn on their Bluetooth adapter, and then they can try again.
* If a STATE_ERR_NO_DEVICE Message is received, the user should be reminded to first pair their ThinkGear hardware device to their Android device's Bluetooth, according to the instructions they received with their ThinkGear hardware device.
* If a STATE_NOT_FOUND Message is received, the user should be reminded to check that their ThinkGear hardware device is properly paired to their Android device (same as the STATE_ERR_NO_DEVICE case), and if so, that their ThinkGear hardware device is turned on, in range, and has enough battery or charge.

There are currently no known issues. If you encounter any bugs or issues, please visit http://support.neurosky.com, or contact [email protected].
If you need further help, you may visit http://developer.neurosky.com to see if there is any new information.
To contact NeuroSky for support, please visit http://support.neurosky.com, or send email to [email protected].
For developer community support, please visit our community forum on http://www.linkedin.com/groups/NeuroSky-Brain-Computer-Interface-Technology-3572341
The algorithms included in this SDK are solely for promoting the awareness of personal wellness and health and are not a substitute for medical care. The algorithms are not to be used to diagnose, treat, cure or prevent any disease, to prescribe any medication, or to be a substitute for a medical device or treatment. In some circumstances, the algorithm may report false or inaccurate results. The descriptions of the algorithms or data displayed in the SDK documentation are only examples of the particular uses of the algorithms, and NeuroSky disclaims responsibility for the final use and display of the algorithms internally and as made publicly available.
The algorithms may not function well or may display inaccurate data if the user has a pacemaker.
THE ALGORITHMS MUST NOT BE USED FOR ANY ILLEGAL USE, OR AS COMPONENTS IN LIFE SUPPORT OR SAFETY DEVICES OR SYSTEMS, OR MILITARY OR NUCLEAR APPLICATIONS, OR FOR ANY OTHER APPLICATION IN WHICH THE FAILURE OF THE ALGORITHMS COULD CREATE A SITUATION WHERE PERSONAL INJURY OR DEATH MAY OCCUR. YOUR USE OF THE SOFTWARE DEVELOPMENT KIT, THE ALGORITHMS AND ANY OTHER NEUROSKY PRODUCTS OR SERVICES IS “AS-IS,” AND NEUROSKY DOES NOT MAKE, AND HEREBY DISCLAIMS, ANY AND ALL OTHER EXPRESS AND IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE, AND ANY WARRANTIES ARISING FROM A COURSE OF DEALING, USAGE, OR TRADE PRACTICE.
IN NO EVENT SHALL NEUROSKY BE LIABLE FOR ANY SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES, INCLUDING BUT NOT LIMITED TO LOSS OF PROFITS OR INCOME, WHETHER OR NOT NEUROSKY HAD KNOWLEDGE, THAT SUCH DAMAGES MIGHT BE INCURRED.
Your code to initialize the btAdapter and tgDevice must first enable the Personalization feature. It must also enable writing to the Android file system. SEE APPENDIX D.
public void onCreate(Bundle savedInstanceState) {
    //...
    btAdapter = BluetoothAdapter.getDefaultAdapter();
    if( btAdapter != null ) {
        TGDevice.ekgPersonalizationEnabled = true;
        tgDevice = new TGDevice( btAdapter, handler );
    }
    //...
}
Copyright © 2004-2011 Jaroslaw Kowalski jaak@jkowalski.net
All rights reserved.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
* Neither the name of Jaroslaw Kowalski nor the names of its
contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS “AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.