
20€ DIY-Eyetracker for school projects

Build an inexpensive eyetracker with primary school students - and control real-world robots with block-based coding!

It is very important to get children excited about STEM topics as early as possible so they can participate equally as adults and shape their future!

In this project, primary school pupils learn that gaze alone can be enough to issue control commands, by using a camera to evaluate eye movement.
They even learn to build such a device, an "eye tracker", from safety glasses and a simple USB camera.

The eyetracker built by students can then be used to discuss other topics such as assistance systems - the robot could, after all, also be a wheelchair controlled only by eye movements.
The project is intended to give teachers and young pupils an opportunity to experiment with this fascinating technology - all they need is block-based coding.

Secondary schools can build an advanced version, disassemble the camera, and use Python with OpenCV to analyze the images.

Overview

This project shows how to build and use a simple and inexpensive eyetracker - primary school students (starting at age 9) can build it and use it in Scratch3 (block-based coding) environments.
It can also be extended to a more sophisticated version involving 3D printing and soldering for secondary school students, to be used with OpenCV and Python coding.

It uses a cheap endoscope USB camera and the frame of a pair of safety glasses. For the advanced version, IR illumination is added, the IR-blocking filter is removed from the small camera, and everything is mounted inside a small 3D-printed housing.

The project also covers how to use this build to control real robots, so students can build (simulate) medical support like wheelchairs or control a computer with their gaze.

Background

Eye trackers observe the eye with a camera and record the movements of the eyeball and the blinking of the eyelids. Contrary to our everyday perception, this movement occurs erratically (in saccades), and conscious perception can only take place in the moments when a point is fixated.

The sequence in which individual elements of a picture or a real-world scene are viewed, and the duration for which they are viewed, provide important clues to their meaning and to the processing of information in the brain. The evaluation of the eye position is automated. High-quality professional systems for research applications can often record both eyes simultaneously and offer a high sampling rate (>60 fps), so that even fast movements can be recorded. The range of possible applications is correspondingly large and extends from medical topics and the processing of stimuli to simpler marketing studies and the optimization of websites, advertising spaces, and product presentations.

The eye tracker presented in this tutorial does not achieve the high sampling rates of professional systems (maybe up to 30 fps), but it is easy for pupils to replicate and can be used in their own programs. For example, games or robots can be controlled with graphical programming in Scratch - or you can use OpenCV for more sophisticated projects.

  • 1 × Endoscope camera with USB connection (approx. 8 €)
  • 1 × Safety glasses with replaceable lenses - we just need the frame
  • 1 × 15-20 cm copper wire (2.5 mm²) - this just provides stability for the "camera arm"
  • 1 × Cable ties - for fixing the components
  • 1 × For the improved version only: 3 mm IR LED (940 nm) to illuminate the eye (caution, no powerful LED!)


  • Advanced version using OpenCV and Python3

    Sergej Stoetzer, 04/23/2023 at 19:12

    If you plan to use this project with secondary school students, you can ask them to do the video analysis in Python3 with the help of OpenCV. There are some fascinating tutorials out there explaining all the needed steps.

    Usually the steps are: grab an image frame, convert it to grayscale, apply a filter to reduce noise, then run the analysis. You can either convert further to black and white and crop the image so that the only moving dark spot is the iris, or look for circular shapes and check whether their size is plausible for an iris.

    The parameters of the black-and-white conversion, or the size of the circles to look for, depend on tests you have to do yourself using your build - see the sketch below...
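    A minimal sketch of that pipeline in Python3 with OpenCV (this is not the project's sample code from GitHub - the camera index and all detection parameters are assumptions you will have to tune against your own build):

    import cv2

    # Minimal iris-detection sketch: grab a frame, convert to grayscale,
    # denoise, then look for circular shapes of a plausible iris size.
    cap = cv2.VideoCapture(0)  # the endoscope may have a different index

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.medianBlur(gray, 5)  # reduce sensor noise before detection

        circles = cv2.HoughCircles(
            gray, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
            param1=100, param2=30,        # edge / accumulator thresholds
            minRadius=15, maxRadius=60,   # plausible iris radii in pixels
        )
        if circles is not None:
            for x, y, r in circles[0]:
                print(f"circle at x={x:.0f}, y={y:.0f}, r={r:.0f}")

        cv2.imshow("analysis", gray)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break

    cap.release()
    cv2.destroyAllWindows()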

    Here's a short screenshot with different outputs from the sample code (check the GitHub link):

    The sample code prints the x/y coordinates of the circle(s) found to the console for further inspection. Congratulations, your students have now successfully reduced complex images to the x/y coordinates of a moving eye! From here they can easily build their own applications for the eye tracker!
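    As one possible next step, a simple (hypothetical) mapping from the printed coordinates to direction commands could look like this - the center point and dead zone are assumptions that have to be calibrated per build:

    # Hypothetical mapping from iris coordinates to direction commands.
    # CENTER and DEADZONE must be calibrated for your own build.
    CENTER_X, CENTER_Y = 320, 240  # straight-ahead position in a 640x480 frame
    DEADZONE = 40                  # pixels of tolerance around the center

    def direction(x, y):
        dx, dy = x - CENTER_X, y - CENTER_Y
        if abs(dx) < DEADZONE and abs(dy) < DEADZONE:
            return "straight"               # no command while looking ahead
        if abs(dx) > abs(dy):
            return "left" if dx < 0 else "right"
        return "up" if dy < 0 else "down"

    print(direction(250, 245))  # -> "left"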

  • Short demo of block-based eyetracking - controlling an mBot2

    Sergej Stoetzer, 04/23/2023 at 18:54

    Here is a short demo of the code... it uses the slower "live mode" of the software, so the reaction of the robot is not immediate.
    Upload mode (the program runs on the robot) with communication back to the computer (which runs the image processing) is not possible for all robots - that's why the slower mode is used here...

  • What's next: primary school in Austria - workshop in May

    Sergej Stoetzer, 04/23/2023 at 13:28

    On May 23rd I will be visiting a primary school in Austria, and we will build several eyetrackers with 3rd grade students. These will be the simple version, though, and we plan to build them and do a first coding session within 60 minutes. The school focuses strongly on topics of medical support and "medical engineering"; students use a 3D printer and even a laser cutter with the assistance of their teachers.

    I will publish some photos and maybe a short video here... A first short demo was successful at a secondary school in Berlin. But now I don't have an endoscope camera left, so all images were taken with the "advanced" version of the eyetracker.


  • 1
    Simple Eyetracker (primary schools)

    The simple version of the eyetracker just uses the endoscope camera attached to the frame of the safety goggles - no soldering or 3D printing required. This can be done in small groups of students (2-3) within 45 minutes, once the materials are prepared:

    The copper wire (2.5 mm²) is taken from electrical installation cable; any hardware store should have a similar selection (here, 3 wires are contained in different colors - the colors don't matter, since we don't connect any electrical components, and it helps save on material if we use all of them).

    The next step involves a computer showing the live camera feed, to find the optimal position of the camera in front of the eye. Since the camera is round, it is also a good idea to mark the top of the camera once the eye is oriented correctly in the camera view - in the picture this is done with a little orange sticker.
    Make sure the entire eye is clearly visible, not only the iris. Since cheap cameras have a fixed focus, we need to adjust the position sideways (distance to the nose) and the height as well (distance to the eyebrow). For both, we can use the goggle frame as a reference point:

    In the third step, we reproduce this position with the flexible wire, bending it so that it will roughly be in the correct position once the camera is fixed to it. The wire can easily be 30 cm long; we cut off the excess at a later stage.

    Now it is time for cable ties - use two of them to fix the camera to the wire, with the marking at the top - but do not fully tighten them yet!

    Then fix the support wire together with the USB cable to the frame. Leave the cable ties loose for now - once tested, they will be tightened and trimmed.

    Now ask the pupils to put on the safety frame with the camera attached - and to pay attention to the cable ties and the eyes ;-)
    This is the step to check for proper alignment and to move the camera by bending the wire. Small adjustments can be made later at any time, but we need to get the rough orientation and dimensions right. Use the computer to check the camera feed...
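    Any webcam viewer app works for this check. If none is at hand, a few lines of Python with OpenCV do the job as well (a sketch - camera index 1 is an assumption, the endoscope may enumerate differently on your computer):

    import cv2

    # Minimal live viewer to check the camera position.
    cap = cv2.VideoCapture(1)  # adjust the index until the endoscope shows up
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("camera position check", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
    cap.release()
    cv2.destroyAllWindows()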

    Once everything looks good, it is time to tighten the zip ties using pliers and cut off the excess of the flexible copper wire (acting as a support). If you are working with groups of students, make sure they cut the correct cable - not the USB one ;-)

    And this is the result - a simple construction suitable for primary school projects, using an endoscope camera and a safety goggles frame supported by a copper wire:

  • 2
    Using the Eyetracker (simple version) - blockbased coding like Scratch3

    Scratch is a graphical programming language that is ideal for getting started with young students, especially in primary school and early secondary school. The language is developed by MIT and was released in its third version (Scratch 3) at the beginning of 2019. Additional functions can be implemented via extensions, e.g. for processing voice commands or images.

    The robotics manufacturer Makeblock offers a programming environment based on Scratch 3, into which machine vision can be integrated very easily. The software can be downloaded free of charge from the following link:

    https://www.makeblock.com/software/mblock5

    Then add the extension for training neural networks ("Teachable Machine") - use the Sprites part of the coding area:

    This will add image categorization into your sprite coding categories:

    By default, Teachable Machine offers 3 categories. You add images from a webcam to each category for learning. While you can always add more pictures, the number of categories cannot be changed later unless a completely new model is built. To differentiate between age groups, or to allow students to progress at their own pace (scaffolding), you could also start with just 3 categories - have the robot drive straight ahead and only trigger left/right turns by eye movements.

    For full directional control we need the 4 direction categories "up", "down", "left", and "right" - and also a category where the eye just looks straight ahead. Otherwise you "force" the model to always pick a direction, even if the wearer of the eyetracker does not indicate a direction at all...

    As images are added, the model is rebuilt immediately to incorporate them and becomes more precise with more pictures; 15-30 per category is a good guideline. Once the file is saved, closed, and opened again, you will no longer see the images - they are not kept, only the model generated from them. This all happens locally; no cloud-based image processing takes place (GDPR-"safe" for schools).

    If the advanced eye tracker is mounted on the right side, the camera image is rotated by 180°, as seen in the figure above. Here this does not matter, because the training algorithm processes the images themselves, regardless of the position of the eye. For the evaluation with Python and OpenCV, the image can simply be rotated.
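    In OpenCV the rotation is a one-liner; the file name here is just a stand-in for whatever frame source you use:

    import cv2

    frame = cv2.imread("eye_frame.jpg")        # or a frame from cv2.VideoCapture
    frame = cv2.rotate(frame, cv2.ROTATE_180)  # undo the 180° mounting rotation
    cv2.imwrite("eye_frame_upright.jpg", frame)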

    To control the robot, use messages between sprites and the robot programming area:

    In the Scratch program, only a section is shown because the structure repeats:

    If a direction is recognized, an arrow in the corresponding direction is shown as a sprite (costume). This gives the user visual feedback as to which direction of movement has been detected. If the space bar is then pressed while a direction is detected, the information about the direction of gaze is broadcast to other parts of the program - e.g. for the direction of travel of a robot. If the user is looking straight ahead, nothing is displayed and a stop command is transmitted. We need a confirmation of the direction because the gaze also moves unintentionally, and the space bar is the biggest key on the keyboard and easy to hit ;-)
    It takes a bit of practice to intentionally move the eye in the desired direction while also checking the arrow on the screen for visual feedback (ideally using peripheral vision). So for students or young pupils, it might be easiest to start with left/right commands only...
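    For readers who prefer text over blocks, here is the same confirm-then-broadcast logic sketched in Python (the function names are placeholders for the corresponding Scratch/mBlock blocks, not a real API):

    def control_loop(detect_direction, space_pressed, show_arrow, broadcast):
        # detect_direction() stands in for the Teachable Machine recognition
        # block; broadcast() for Scratch's "broadcast message" block.
        while True:
            direction = detect_direction()  # "up", "down", "left", "right" or "straight"
            if direction == "straight":
                show_arrow(None)       # no visual feedback
                broadcast("stop")      # robot stops while the gaze is neutral
            else:
                show_arrow(direction)      # visual feedback for the wearer
                if space_pressed():        # confirmation keeps saccades from steering
                    broadcast(direction)   # picked up by the robot part of the code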

    The robot part of the code just receives the broadcasts and executes different driving commands:

    Again, this picture only shows a section of the code... the example file has code for 3 different robots...

  • 3
    Advanced version - IR illumination and a custom camera case

    If you like, you can build an advanced version of the eyetracker. Professional eyetrackers illuminate the eye with IR light and either check for reflections on the eye or track the darkest spot (the iris), depending on the position of the tracker.

    BE CAREFUL WITH IR ILLUMINATION. Our eyes cannot detect IR light, and if it is too strong it can damage the retina. Only use extremely low-power LEDs and increase the resistor so that the LED emits just enough light.
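    Sizing the resistor follows the usual LED formula R = (V_supply - V_forward) / I. A quick check with assumed values (verify against your LED's datasheet, and err on the side of a larger resistor):

    # Hypothetical resistor sizing for the IR LED - all values are assumptions.
    v_supply = 5.0    # V, from the camera's former LED supply lines
    v_forward = 1.3   # V, typical forward voltage of a 940 nm IR LED
    i_led = 0.002     # A, deliberately far below the usual 20 mA

    r = (v_supply - v_forward) / i_led
    print(f"use at least {r:.0f} ohms")  # ~1850 ohms, so 2.2 kOhm is a safe pick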

    This is the overview of the materials needed: a 3 mm IR LED with a resistor, a support wire, a small heatsink (ideally), and heat-shrink tubing.

    Using pliers, carefully remove the cylindrical housing of the endoscope camera. The camera module rests vertically on the circuit board at the front: 

    Carefully detach the small camera module from the circuit board (hot glue holds it in place). Do not damage the thin conductor tracks. For installation in the 3D-printed housing, the drop of hot glue must be removed with a cutter. There is a tiny circular circuit board on the camera module carrying the white LEDs for illumination (facing away from the observer in the picture) - please remove it and cut the conductive tracks.

    Unscrew the lens from the camera module - the IR cut filter has a reddish shimmer. We need to remove this filter, as it blocks nearly all IR light:

    Using a cutter, carefully remove the IR cut filter but leave the lens intact ;-)

    Design a 3D housing and print it, adjusting the cut-out for the heat sink if necessary (the PCB has a silver surface that gets quite warm; to be on the safe side, the heat should be dissipated by a heat sink). Stick a piece of exposed film onto the camera opening of the 3D-printed lid.

    Since there are many different endoscope cameras, any 3D files provided would probably not match... Since this is a school project, the students can easily create the housing themselves :-)

    Glue the IR LED into the lid, and connect the terminals of the LED to the cables that powered the previous camera light - in my example, blue = cathode, yellow = anode. They can be desoldered from the circuit board and used only for the IR LED with a resistor.
    Then glue everything into the housing. Next to the connection cable, glue in a wire for stabilization:

    This is the closed 3D-printed case for the eyetracker. The supporting wire and USB cable are enclosed in heat-shrink tubing. The IR LED is on, barely visible to a normal smartphone camera. The endoscope camera is covered by exposed analog film, which acts as a blocking filter for visible light but lets IR pass.

    This is the finalized version of the eyetracker:

    If you check the IR illumination with an IR camera, you can see that even low-power LEDs emit quite a powerful beam of light:

