
EEG acquisition and test of BCI with motor imagery

A project log for Brainmotic

EEG to control a Smart house. Oriented to persons with disabilities.

Daniel Felipe Valencia V, 10/03/2016 at 07:08

We have started capturing EEG data with the TI ADS1299: eight channels, with electrodes at positions C3, P3, T7, and P7 for the odd channels (1, 3, 5, and 7), and at C4, P4, T8, and P8 for the even channels.

The ADS1299 uses differential analog-to-digital converters, so we connected the negative terminal of every channel to the BIAS terminal, which is attached to the earlobe of the person using the device.

The ADS is bridged to the RPi3 through an Arduino UNO; the Arduino is responsible for configuring the converter, and for this we used the OpenBCI code available at: https://github.com/OpenBCI/OpenBCI-V2hardware-DEPRECATED

With the ADS connected to the Arduino and the Arduino to the RPi, we began comparing the amplitudes generated in each hemisphere; for example, when the user thinks about moving the left arm, we expect the voltage of the electrodes over the right hemisphere to be greater than over the left.
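As a rough illustration of the RPi side of this bridge, the sketch below reads samples forwarded by the Arduino over the serial port. The port name, baud rate, and packet format are assumptions for the example; the real framing depends on the OpenBCI firmware, which uses its own binary packets.

```python
# Minimal sketch of the RPi side of the Arduino bridge.
# Assumptions: /dev/ttyACM0, 115200 baud, one ASCII line per sample
# with eight comma-separated channel values (hypothetical format).
import serial

PORT = "/dev/ttyACM0"   # hypothetical device name for the Arduino UNO
BAUD = 115200           # assumed baud rate
N_CHANNELS = 8

def read_samples(n_samples, port=PORT, baud=BAUD):
    """Collect n_samples rows of eight channel values from the Arduino."""
    samples = []
    with serial.Serial(port, baud, timeout=1) as ser:
        while len(samples) < n_samples:
            line = ser.readline().decode(errors="ignore").strip()
            fields = line.split(",")
            if len(fields) != N_CHANNELS:
                continue  # skip malformed or partial lines
            try:
                samples.append([float(f) for f in fields])
            except ValueError:
                continue
    return samples

if __name__ == "__main__":
    data = read_samples(2500)  # e.g. 10 s at 250 samples per second
    print(len(data), "samples captured")
```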

We capture 10 seconds of data from the eight channels at 250 samples per second, process each channel with the FFT, then sum the even channels and the odd channels separately and compare the two sums. Borrowing the idea of voltage levels in TTL gates, we define three levels or thresholds: a middle range corresponding to a high-impedance state, i.e. the user at rest; a range indicating the action "left", analogous to a logic low; and a range indicating "right", analogous to a logic high.
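A minimal sketch of that processing chain, assuming the 10 s capture is a NumPy array of shape (2500, 8) with the odd channels (left hemisphere) in columns 0, 2, 4, 6 and the even channels (right hemisphere) in columns 1, 3, 5, 7. The threshold values and the sign convention of the comparison are placeholders, not the values actually tuned on the hardware:

```python
import numpy as np

FS = 250          # sample rate in Hz
# Placeholder thresholds; the real values were tuned empirically.
LOW_THR = -1.0
HIGH_THR = 1.0

def classify_window(eeg):
    """Classify one 10 s window of shape (2500, 8) as left/right/pause."""
    spectra = np.abs(np.fft.rfft(eeg, axis=0))   # FFT magnitude per channel
    odd_sum = spectra[:, 0::2].sum()             # left-hemisphere channels
    even_sum = spectra[:, 1::2].sum()            # right-hemisphere channels
    # The difference plays the role of a "voltage level" compared against
    # TTL-like thresholds: low -> "left", middle band -> "pause" (rest),
    # high -> "right".
    level = odd_sum - even_sum
    if level < LOW_THR:
        return "left"
    if level > HIGH_THR:
        return "right"
    return "pause"
```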

To confirm that the left-arm or right-arm action has been identified, the system tells the user that the action was understood; this is done with the sounds "left" and "right", generated by a Python text-to-speech library. The "pause" action makes no sound, because the user is expected to remain in that state and it would be annoying for the system to keep repeating "pause."
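The log does not name the text-to-speech library; the sketch below assumes pyttsx3, with "pause" deliberately left silent:

```python
import pyttsx3

engine = pyttsx3.init()

def announce(action):
    """Speak the recognized action; stay silent on 'pause'."""
    if action == "pause":
        return              # no sound while the user is at rest
    engine.say(action)      # "left" or "right"
    engine.runAndWait()
```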

Finally, we want the "left" action to be associated with the button that slides through the interface options, the "right" action with the button that enters the selected option (which controls or reads data from ThingSpeak and from the sensors or actuators connected to the PSoC microcontrollers), and the "pause" option to generate no action on the interface.
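One way to wire the three recognized actions to the interface, sketched with hypothetical callback names (the real interface talks to ThingSpeak and to the PSoC-connected sensors and actuators):

```python
def slide_to_next_option():
    """Hypothetical: move the highlight to the next menu option."""
    pass

def enter_selected_option():
    """Hypothetical: open the option (read/write ThingSpeak, PSoC devices)."""
    pass

ACTIONS = {
    "left": slide_to_next_option,    # navigate between options
    "right": enter_selected_option,  # activate the selected option
    "pause": lambda: None,           # no action on the interface
}

def dispatch(action):
    ACTIONS.get(action, lambda: None)()
```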

Tests of the chain from the user through the ADS and the Arduino to the RPi have proved satisfactory: the strategy of comparing the voltages of the two brain hemispheres allowed the RPi to interpret the imagined action correctly. For example, when the user thinks about moving the right arm, after the 10-second window the RPi answers "right".
