We are using the Odroid to filter the video stream and to process the commands selected with one eye. For the coding we use Python with the OpenCV libraries. This is the first time we are using a real programming language after moving on from Lego Mindstorms NXT, so we decided Python might be easier to learn ;-)
There is an Arduino add-on for the Odroid, and we use it to control the H-bridge we built for driving the motors and to calibrate the eye tracker. Here we use potentiometers (potis) as voltage dividers connected to the Arduino's A/D converters to adjust the eye tracker to the person's physiognomy.
But first we used a little robotic platform to simulate a wheelchair. In the program the video is filtered a few times (blur, b/w conversion etc.) until only the pupil is left in the picture. Its position is calculated, and these coordinates are then compared against four different areas for the commands that steer the wheelchair: forward, back, left, right.
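The filtering chain can be sketched roughly like this. This is a minimal NumPy-only illustration of the idea (our actual program uses the corresponding OpenCV calls such as cv2.GaussianBlur and cv2.threshold); the frame size, threshold value and function names here are made up for the example.

```python
import numpy as np

def box_blur(img):
    # Simple 3x3 box blur (stand-in for cv2.GaussianBlur).
    p = np.pad(img.astype(np.float32), 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def find_pupil(gray, threshold=60):
    # Binarise: the pupil is the darkest blob in the grayscale frame.
    # (In the real program the threshold value comes from a poti.)
    mask = box_blur(gray) < threshold
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no pupil found in this frame
    # Centre of the dark blob = pupil position.
    return float(xs.mean()), float(ys.mean())
```

Once you have the (x, y) position, it only remains to check which of the four command areas it falls into.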
The boundaries of these areas can be changed simply by turning potentiometers that operate as voltage dividers – the Arduino's A/D converters take care of reading this input.
So you can adjust two cut-offs on the X- and Y-axis and a threshold for the binarisation of the webcam image. This way you can very easily adjust the tracker to different people when you show your work.
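In code, the adjustable cut-offs might look like the sketch below. The exact zone layout (horizontal wins over vertical, a small dead zone in the middle) is my assumption for the example, not necessarily how the original areas are arranged, and the Arduino's 10-bit ADC range (0–1023) is a standard assumption.

```python
def adc_to_threshold(adc):
    # The Arduino's 10-bit A/D converter gives 0..1023;
    # scale it to an 8-bit binarisation threshold.
    return adc * 255 // 1023

def pupil_to_command(x, y, x_cut, y_cut, dead=10):
    # Map the pupil position to a drive command.
    # x_cut / y_cut are the cut-offs set via the potis;
    # 'dead' is a neutral zone so small jitter gives no command.
    if x < x_cut - dead:
        return "left"
    if x > x_cut + dead:
        return "right"
    if y < y_cut - dead:
        return "forward"
    if y > y_cut + dead:
        return "back"
    return "stop"
```

Turning a poti then simply shifts x_cut or y_cut, which moves the boundary between the command areas.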
To make sure the wheelchair doesn't move just because of an eye movement, the corresponding command has to be confirmed with a small switch – later it will be operated by the tongue. Or it could be done like Stephen Hawking's device, detecting a tiny movement of a muscle in the cheek (by IR reflection, for example).
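The safety gate itself is tiny – a sketch of the idea, with the function name made up:

```python
def gated_command(command, switch_pressed):
    # Only pass a drive command through while the confirm
    # switch is actually held down; otherwise stay stopped.
    return command if switch_pressed else "stop"
```

Whatever later replaces the switch (tongue button, cheek-muscle sensor) only has to deliver this one boolean.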
We won first prize in the local round of the German science fair and moved on to the regional competition. For that one we decided to move to a proper wheelchair as the platform. We bought a second-hand wheelchair without motors and started all over again. This time we used the brand-new Raspberry Pi 2B with its quad-core CPU, because it is much cheaper and we want our project to be as cheap as possible so it is affordable for everyone. Our aim is that this approach can be used widely and rebuilt for people in need. We were also able to compare the speeds: the "old" Raspberry Pi (3–4 fps), the Odroid U3 (12 fps and up) and the new Raspberry Pi 2B (8–9 fps) – not too fast, and a bit annoying to wait for the tracker to recognise the command, but half the price of the Odroid... Perhaps someone could help us use all four cores, if that is possible?
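One way to use all four cores would be Python's multiprocessing module: since each frame is filtered independently, the per-frame work can be spread over a pool of worker processes. This is only a sketch – the dummy process_frame below stands in for the real blur/threshold/centroid chain, and for a live tracker you would want a streaming pipeline (grab in one process, filter in others) rather than batching, so latency doesn't grow.

```python
from multiprocessing import Pool

def process_frame(frame):
    # Stand-in for the real filtering chain; here just a dummy
    # reduction so the sketch stays self-contained.
    return sum(frame) / len(frame)

def process_batch(frames, workers=4):
    # Spread the per-frame work over the Pi 2B's four cores.
    # Throughput (fps) scales with the workers as long as the
    # frames are processed independently of each other.
    with Pool(workers) as pool:
        return pool.map(process_frame, frames)
```

On the Pi 2B this kind of parallelism helps because OpenCV's Python loop otherwise keeps a single core busy.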
This time we used motors from windscreen wipers that we salvaged from the trash, plus relays to switch them on and off and to control the direction.
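The relay logic boils down to a lookup table. The wiring below is an assumption for the example – one relay switches each motor on, a second one flips its polarity for reverse, and turning is done by running only one motor – so adapt the table to your own relay board:

```python
# Relay states per command:
# (left_on, left_reverse, right_on, right_reverse)
RELAY_TABLE = {
    "forward": (True,  False, True,  False),
    "back":    (True,  True,  True,  True),
    "left":    (False, False, True,  False),  # only right motor runs
    "right":   (True,  False, False, False),  # only left motor runs
    "stop":    (False, False, False, False),
}

def command_to_relays(command):
    # Unknown commands fall back to "stop" - safer than guessing.
    return RELAY_TABLE.get(command, RELAY_TABLE["stop"])
```

The four booleans then simply drive four GPIO pins connected to the relays.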
We constructed casings for the camera and some wheels to turn the wheelchair's wheels and 3D-printed the lot ;-)
At the moment we are working on a collision detection system to make the wheelchair safer...
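The core of such a system can be a simple veto on the drive command. Distance sensors facing the direction of travel are an assumption here (we are still working on the hardware side), and the clearance value is made up:

```python
def safe_command(command, distances_cm, min_clear_cm=40):
    # Veto the drive command when any obstacle is too close.
    # distances_cm: readings from sensors facing the direction
    # of travel; any reading below min_clear_cm stops the chair.
    if command != "stop" and any(d < min_clear_cm for d in distances_cm):
        return "stop"
    return command
```

This sits between the eye tracker's command and the relay control, so the chair stops no matter what the eyes say.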
We will present this project at the national level of the German science fair next week – wish us luck! :-)
You can find the paper for download in my other blog, and we will make all files (software, 3D files etc.) available soon. Everything will be CC BY-NC, so everybody can modify, rebuild, improve... but not make money with it, since that is against the original idea ;-)
At the moment the descriptions in our paper are in German only; I will work on that soon as well.
Looking forward to your comments ;-)