An innovative, non-invasive, assistive Human Interface Device for people affected by ALS, based upon the Electrooculography (EOG) technique.

ALS affects more than 5,000 people in Italy, and their lives are monotonous: confined to a wheelchair, assisted by a caretaker or relatives. EyesDrive wants to give some independence back to paralyzed people: a pair of glasses with "some special electronics" inside gives them the ability to drive a wheelchair with eye movements. The whole project is based upon the Electrooculography technique, already used in the health field for diagnosis and a subject of research for the consumer field.

EyesDrive is a new, non-invasive, assistive human interface device that uses eye movement to control any compatible electronic device, such as a wheelchair, a mouse pointer, or a car steering wheel. The project is based upon the Electrooculography technique, already known in the medical field for diagnosis.

Eye potential

The eye is home to an electric potential, independent of light stimuli: it acts as a fixed dipole, with the positive pole on the cornea and the negative one on the retina. The corresponding potential difference is called the corneo-retinal standing potential. This voltage is attributed mainly to physiological processes taking place in the retina.

The corneo-retinal standing potential and the eye movement allow the measurement of a signal known as Electrooculogram.

With the eyes pointing straight ahead, the electrodes sit at the same electric potential, so no signal is recorded. Moving the eyes creates a potential difference between the electrodes: the electrode facing the rotation side becomes positive relative to the other one.
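This polarity rule can be sketched in a few lines of Python (an illustration only, not the project's firmware; the dead-band threshold is a made-up placeholder that a real system would calibrate per user):

```python
THRESHOLD_V = 0.1e-3  # hypothetical dead-band around zero, in volts

def gaze_direction(v_left: float, v_right: float) -> str:
    """Classify horizontal gaze from the potential difference between
    electrodes placed at the outer corners of the eyes."""
    diff = v_right - v_left
    if diff > THRESHOLD_V:
        return "right"   # the electrode facing the rotation side goes positive
    if diff < -THRESHOLD_V:
        return "left"
    return "center"      # eyes straight ahead: electrodes at equal potential

print(gaze_direction(0.0, 0.0))      # center
print(gaze_direction(0.0, 0.5e-3))   # right
```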

Electrooculography has both advantages and disadvantages compared to other eye-position detection systems. Its main advantage is that it can sense eye movement in all scenarios: with or without light, and in almost any environment, with minimal interference.

EyesDrive Frontend

The EyesDrive system was developed with ALS and other movement diseases in mind: after several signal-processing stages, it maps eye movements and makes them understandable to a computer. The heart of EyesDrive is its frontend, which transmits the acquired EOG signal over a Bluetooth interface to any connected compatible device. Its simple, easy-to-implement protocol makes it possible to add EyesDrive assistive technology to virtually any device.

EyesDrive Frontend block diagram

The signal acquisition chain begins with EOG signal amplification: an instrumentation amplifier boosts the tiny corneo-retinal potential to a usable level. The remaining stages of the analog chain filter and condition the signal for optimal quantization by the microcontroller's ADC.
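As a rough back-of-the-envelope sketch (the amplitude and ADC figures below are typical textbook values, not measurements from this project), the overall gain the chain needs can be estimated like this:

```python
eog_peak_v = 500e-6   # assumed ~0.5 mV peak EOG amplitude (textbook figure)
adc_ref_v = 5.0       # e.g. an Arduino Uno's 5 V ADC reference
adc_bits = 10

# Gain needed to map the EOG swing onto half the ADC range,
# leaving headroom around a 2.5 V mid-supply reference.
target_swing_v = adc_ref_v / 2
gain = target_swing_v / eog_peak_v
lsb_v = adc_ref_v / (2 ** adc_bits)

print(f"required gain  ~ {gain:.0f}x")               # 5000x
print(f"ADC resolution ~ {lsb_v * 1e3:.2f} mV/LSB")  # 4.88 mV/LSB
```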

To reduce the 50 Hz hum from the power grid, the common-mode rejection ratio of the instrumentation amplifier alone isn't enough: an additional circuit, called the "Reference Driver", was added. As the name says, it "drives" the user's skin to a known potential, reducing the hum picked up by the body.
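The Reference Driver works in the analog domain; as a complementary, purely illustrative note, a digital comb filter can null whatever 50 Hz hum survives into the sampled signal. Averaging each sample with the one taken half a mains period earlier cancels the hum exactly (the sampling rate below is an assumption, not the project's):

```python
import math

FS = 500.0                 # assumed sampling rate, Hz
F_HUM = 50.0               # European mains frequency
K = int(FS / (2 * F_HUM))  # samples in half a hum period (5 here)

def notch_50hz(samples):
    """Comb filter y[n] = (x[n] + x[n-K]) / 2, with zero response at 50 Hz."""
    out = []
    for n, x in enumerate(samples):
        prev = samples[n - K] if n >= K else x
        out.append((x + prev) / 2)
    return out

# A pure 50 Hz hum is cancelled almost completely:
hum = [math.sin(2 * math.pi * F_HUM * n / FS) for n in range(100)]
filtered = notch_50hz(hum)
print(max(abs(v) for v in filtered[K:]))  # ~0 (numerical noise only)
```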

Application examples

The EyesDrive system can modernize several "old-style" assistive medical devices: the frontend can control a wheelchair's direction, making its use more natural. Another application is speech synthesis: eye movements can select characters and buttons on a computer screen without a mouse, making computer control more straightforward for paralyzed people.

  • Wait, it's working?

    Federico Runco, 10/04/2019 at 17:06

    In the last month I was quite busy: I've started school and there are a lot of other things eating my time. But this week, thanks to my classmate's Bluetooth e-car, I've tested the Frontend with something "physical".

    I'm the one with the electrodes and uncombed hair.

    Sometimes it gets stuck because I put in the wrong calibration values, but it kinda worked; the horizontal axis was swapped because I was too lazy to swap the input cables.

    The box the electrodes are connected to is the Frontend, while the thingy with the red light is just a power bank.

  • The EyesDrive PCB is on its way!

    Federico Runco, 08/18/2019 at 20:17

    And after about 16 non-working prototypes made over the past 6 months, I can officially say that this revision of the circuit is the right one, also because it's the only one that worked for 30 minutes straight, without any calibration, for about 7 consecutive days. Currently, the frontend can only distinguish between left, center and right positions; no blink detection yet.

    To fully test the whole project I wrote a little (and also my first) script in Python to simulate the computer's arrow keys, and I played some random driving games on the internet. In the beginning it was quite strange, but after a while it felt almost natural.
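    A hedged reconstruction of what such an arrow-key script could look like (this is not the actual script; in practice the key handlers would be something like pyautogui.keyDown/keyUp, stubbed out here so the logic stands alone):

```python
def direction_to_key(direction):
    """Map a frontend decision to the arrow key to hold down (center -> None)."""
    return {"left": "left", "right": "right"}.get(direction)

def drive_loop(read_direction, press, release):
    """Hold the key matching the gaze; release it when the eyes re-center.
    read_direction returns "left"/"center"/"right", or None to stop."""
    held = None
    while True:
        direction = read_direction()
        if direction is None:
            break
        key = direction_to_key(direction)
        if key != held:
            if held:
                release(held)
            if key:
                press(key)
            held = key
    if held:
        release(held)

# Example run with stub handlers instead of real key presses:
events = []
stream = iter(["right", "right", "center", "left", None])
drive_loop(lambda: next(stream),
           lambda k: events.append(("down", k)),
           lambda k: events.append(("up", k)))
print(events)  # [('down', 'right'), ('up', 'right'), ('down', 'left'), ('up', 'left')]
```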
    The last thing I hadn't made was a PCB, so I designed it in Eagle (no KiCad, sorry, it's quite hard for me to understand) and purchased a prototype PCB from JLCPCB.

    What surprised me about JLCPCB is the price: it's really cheap. I paid 2 dollars for 5 boards, without any shipping fee, and that's incredible!

    I will probably make a video of the frontend working when the PCB arrives, also because it will be around the time school begins; then I can also show you some cool waveforms on the oscilloscope.

  • Welcome to the world, analog frontend!

    Federico Runco, 08/12/2019 at 17:29

    So yes, hopefully, the analog frontend is finally complete! Aside from minor modifications (like better op-amps, not that shitty 741), the design is done. As you can see in the screenshot down here, eye position and blinks are both recognizable. The last step is to process the signal on the Arduino. Eye-position processing is quite easy, but blink recognition will be a challenging quest.
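    For what it's worth, one simple blink-detection idea (an assumption on my part, not necessarily the algorithm that ends up in the project): blinks appear in the vertical channel as short, large spikes, while deliberate gaze shifts hold their level much longer, so run length can tell them apart:

```python
def detect_blinks(samples, threshold=1.0, max_len=5):
    """Return start indices of above-threshold runs shorter than max_len."""
    blinks, run_start = [], None
    for n, x in enumerate(samples + [0.0]):  # trailing sentinel flushes a run
        if x > threshold:
            if run_start is None:
                run_start = n
        elif run_start is not None:
            if n - run_start < max_len:
                blinks.append(run_start)
            run_start = None
    return blinks

# One short spike (a blink) and one long excursion (an upward gaze):
sig = [0.0] * 10 + [2.0] * 3 + [0.0] * 10 + [2.0] * 8 + [0.0] * 5
print(detect_blinks(sig))  # [10]
```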

  • AC coupling. That unknown thing

    Federico Runco, 08/08/2019 at 20:50

    In the last few days I finally decided to go to the "nearest" (about 15 km away) electronics shop to get the missing components. While I got almost all the passive values wrong, it didn't affect the results much.

    The good thing is that, after days spent wondering why the signal wasn't centered at my reference voltage, I realized I hadn't AC-coupled the in-amp: the electrodes add a DC offset to the signal, and a high-pass filter placed after the amplifier can't help once the amplified offset has pushed the stage off its operating point, so the DC has to be blocked before the gain.
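    A quick sanity check of an AC-coupling network (component values are illustrative, not the ones from the redesigned preamp): the RC corner must sit well below the EOG band, roughly 0.1 to 30 Hz, so the slow eye-movement signal passes while the electrodes' DC offset is blocked:

```python
import math

R = 1e6   # 1 MOhm (illustrative)
C = 1e-6  # 1 uF (illustrative)

f_c = 1 / (2 * math.pi * R * C)
print(f"corner frequency = {f_c:.3f} Hz")  # 0.159 Hz

def hp_gain(f):
    """Magnitude response of a first-order RC high-pass at frequency f."""
    return (f / f_c) / math.sqrt(1 + (f / f_c) ** 2)

# DC (the electrode offset) is blocked entirely, while a 10 Hz component
# inside the EOG band passes essentially untouched:
print(hp_gain(0))            # 0.0
print(f"{hp_gain(10):.4f}")  # 0.9999
```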

    So I opened LTspice and redesigned the preamp stage; here's the result (the blue trace is the original signal, with an offset of circa 50 mV):

    Preamp transient analysis

    After redesigning the whole preamp stage, the signal is centered at around 2.5 V; now I only need to filter the signal to complete the design.

    I'm also considering buying an oscilloscope: with it I would probably have seen the DC offset ages ago with the FFT (the LabVIEW FFT is crap) and considered AC coupling sooner, but nowadays DSOs are quite expensive.
    Here's a recording with the actual circuit. And yes, I'm using an Arduino as a myDAQ, since those things are also quite expensive.

    The preamp alone (the circuit on the breadboard) uses 3 op-amps without any filtering. I don't know how many ICs will end up on the completed board.

  • Hello there.

    Federico Runco, 08/02/2019 at 10:28

    This is my first time using this site for a project of mine. I usually used it to look at all those fantastic projects on there.

    About EyesDrive: the signal-conditioning circuitry is almost complete. I'm designing the last stages in LabVIEW (at least nothing will catch fire if I simulate things, or am I wrong?). I'm currently trying to deal with the baseline drift that affects my signal. I don't know if it's caused by those crap electrodes or by some "strange biological process" in my body.
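    One common software fix for this kind of baseline drift (a generic sketch, not necessarily what the VI does): estimate the slow drift with a long moving average and subtract it from the signal:

```python
def remove_drift(samples, window=50):
    """Subtract a moving-average estimate of the baseline from each sample."""
    out = []
    for n in range(len(samples)):
        lo = max(0, n - window + 1)
        baseline = sum(samples[lo:n + 1]) / (n + 1 - lo)
        out.append(samples[n] - baseline)
    return out

# An unbounded linear ramp (pure drift) collapses to a small constant
# residual once the averaging window fills up:
drift = [0.01 * n for n in range(200)]
flat = remove_drift(drift)
print(round(flat[100], 3), round(flat[199], 3))  # 0.245 0.245
```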

    The simulated stage seems to be working, but the signal is almost unrecognizable, and I don't know if it will hold up over long periods.

    Here's a screenshot of the LabVIEW VI: the first graph shows the raw signal captured over a 20-second span, and the last graph shows the processed signal.
