Low Cost Open Source Eye Tracking

A system designed to be easy to build and set up. It uses two $20 webcams and open-source software to provide accurate eye tracking.

The purpose of this project is to convey a location in three-dimensional space to a machine, hands-free and in real time.

Currently it is very difficult to control a machine without requiring input from the user's hands, and specifying a location in space usually demands a complex input device. This system provides a novel solution to both problems by allowing the user to specify a location simply by looking at it.

Eye-tracking solutions are normally prohibitively expensive and closed source, which keeps creators from integrating them into new projects. This solution is fully open source, easy to build, and gives makers a huge variety of options for using this fascinating and powerful technology.

The project software repository is located on GitHub here: The Jevons Camera Viewer

Please see the GitHub repository for all design files. This is a working prototype.

Please see Eye_Tracker_Software_Quick_Start_Guide.pdf in the files to get started with the software.

The software also includes several real-time video-processing features:

  • optical flow processing
  • shape identification (circles, lines, and triangles)
  • exposure locking on the c270
  • posterization for blob detection
  • simultaneous display of multiple cameras
  • real-time mirroring and rotation of the video feeds
  • video zoom
  • basic edge detection
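Posterization, for example, snaps each 8-bit channel down to a handful of levels so that regions of similar colour merge into uniform blobs that are easier to detect. A minimal NumPy sketch (the level count and the tiny test frame are illustrative; this is not the viewer's actual code):

```python
import numpy as np

def posterize(frame: np.ndarray, levels: int = 4) -> np.ndarray:
    """Quantize each 8-bit channel down to `levels` discrete values."""
    step = 256 // levels           # width of each quantization bin
    return (frame // step) * step  # snap every pixel to its bin's floor

# Tiny synthetic grayscale "frame"
frame = np.array([[10, 130], [200, 255]], dtype=np.uint8)
out = posterize(frame, levels=4)   # values collapse to 0, 128, 192, 192
```

After quantization, connected runs of identical values form the blobs that a detector can then label and measure.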

Real-time Human Machine Interface:

The system accomplishes this using a set of fixed-point laser beacons on a working surface and two cameras. One camera watches the user's eye while the other looks forward to see what the user is looking at. The eye camera captures an infrared reflection off the user's eye, calculates the direction of the user's gaze, and defines a "looking vector." The forward camera recognizes the fixed-point beacons on the working surface, measures the pixel distances between the beacons in each video frame, and determines the position of the user relative to the working surface. Using the information from both cameras, the system maps the "looking vector" onto the working surface, communicating the user's desired location in space to a machine simply by looking.
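One way to implement the mapping step (a sketch of the general technique, not necessarily how Jevons Camera Viewer does it) is a planar homography: if the forward camera sees four beacons whose positions on the working surface are known, a 3×3 homography relates forward-camera pixels to surface coordinates, and the pixel where the looking vector crosses the forward image can be mapped onto the surface. The beacon layout and gaze pixel below are invented for illustration:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src via the DLT method."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.array(A))
    return Vt[-1].reshape(3, 3)      # null vector of A, reshaped to 3x3

def map_point(H, p):
    """Apply H to a pixel and dehomogenize to surface coordinates."""
    x, y, w = H @ np.array([p[0], p[1], 1.0])
    return x / w, y / w

# Beacon pixels in the forward camera (hypothetical) -> surface corners in cm
beacon_px = [(100, 80), (540, 90), (530, 400), (110, 390)]
surface_cm = [(0, 0), (60, 0), (60, 40), (0, 40)]
H = homography(beacon_px, surface_cm)

# A gaze pixel: where the looking vector crosses the forward camera's image
u, v = map_point(H, (320, 240))
```

In the real system the four `beacon_px` values would come from the beacon detector on every frame, so `H` is continually re-estimated as the user's head moves.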

A flow diagram showing how all the different parts of the system work together is shown below.

Future Plans:

Calibration improvements:

Red lasers, diffused with masking tape, have been used to locate the surface onto which the user's gaze will be mapped. This will greatly improve calibration and allow the user's gaze to be fed into the computer as a user input. Any surface can be used, including very large production floors or warehouses.

Calibration will be converted into a workflow in which the user follows a moving dot. Bad calibration values will be removed automatically by a statistical algorithm that has yet to be implemented, based on simple rules about how far a calibration point may be from the previous one and in which direction. A calibration point more than twice the distance of the previous step will be removed (subsequent input will be interpolated, or the user will be asked to repeat calibration), and any calibration point that jumps in the opposite direction from the previous step will also be removed.
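The planned rejection rules might look like the following sketch. The function name is invented, and "more than twice the distance" is interpreted here as "more than twice the previous step's length":

```python
import math

def filter_calibration(points):
    """Keep calibration points that move a plausible distance and direction.

    A point is rejected if it is more than twice as far from its predecessor
    as the previous accepted step was, or if it jumps in the opposite
    direction from the previous accepted step.
    """
    kept = [points[0]]
    prev_step = None
    for p in points[1:]:
        q = kept[-1]
        step = (p[0] - q[0], p[1] - q[1])
        dist = math.hypot(*step)
        if prev_step is not None:
            prev_dist = math.hypot(*prev_step)
            dot = step[0] * prev_step[0] + step[1] * prev_step[1]
            if dist > 2 * prev_dist or dot < 0:  # too far, or reversed direction
                continue                         # reject this sample
        kept.append(p)
        prev_step = step
    return kept

# The big jump to (10, 0) is dropped; the track resumes at (3, 0)
clean = filter_calibration([(0, 0), (1, 0), (2, 0), (10, 0), (3, 0)])
```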

Eye Tracking Logging and Analysis:

Currently no eye-tracking data is logged. A custom-built logging library has been added to address this. Video will also need to be recorded so that the user's gaze can be mapped onto a specific image or video for further analysis; this means a timecode must be recorded with the video and matched against the gaze-log file created by the system. An R script may be written to generate heat maps for static UI analysis.

Machine Addressing and Control:

The end goal of this project is to direct a machine's end effector (or the machine itself) to a location, which is currently an extremely difficult problem to solve hands-free. The user will be able to direct a machine without taking their hands off their current task.

Currently the proof of concept revolves around moving a modified pick-and-place machine to pick up and move blocks based on the user's eye gaze. Please see the Machine Control branch on GitHub for more information.


This project uses the Apache License Version 2.0 (APLv2) for all software and hardware.



Brief overview of how to get the software up and running.

Adobe Portable Document Format - 2.53 MB - 08/27/2018 at 13:41



APLv2 license for all software and hardware components

plain - 11.27 kB - 08/26/2018 at 23:24



  • Setting Up the Fixed-Point Laser Beacons

    Matthew Barker, 08/26/2018 at 22:54

    A few different light sources were tried before settling on lasers as the best option. These are simple 3 V red dot lasers that have been diffused with a piece of masking tape. The lasers are bright enough to be easily detected by the forward camera even in a brightly lit room.

    Below is a screenshot showing what the forward camera sees (top half) and the reflection of the IR LED on the eye (bottom half).

    In order to recognize the beacons in software, we apply some filters using the Jevons Camera Viewer software.
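    One way to pick out the diffused red dots is to keep only pixels that are much stronger in the red channel than in green or blue, then take the centroid of the surviving region. A NumPy sketch (the thresholds and the synthetic frame are illustrative, not the filter values used in the viewer):

```python
import numpy as np

def red_mask(frame, min_red=180, max_other=100):
    """Boolean mask of pixels that are strongly red but weak in green/blue."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return (r >= min_red) & (g <= max_other) & (b <= max_other)

def centroid(mask):
    """Pixel-space centre of the masked region (one beacon at a time)."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

# Synthetic 8x8 RGB frame with a bright red 2x2 "beacon"
frame = np.zeros((8, 8, 3), dtype=np.uint8)
frame[5:7, 3:5, 0] = 255           # rows 5-6, columns 3-4 are pure red
cx, cy = centroid(red_mask(frame))  # sub-pixel beacon centre
```

    With several beacons in view, the mask would first be split into connected components and a centroid computed per component.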

  • Mounting and Structural Changes

    Kirsten Olsen, 08/26/2018 at 15:45

    To better stabilize the forward-facing camera, we utilized a carrier apparatus that is mounted onto the top of the glasses using screws. We chose to use a wood carrier apparatus, although plastic would be a suitable material as well. This change eliminated the zip tie mounting.

    We changed the eye-tracking camera's mounting to a through-hole mounting, drilled directly into the glasses. The new mounting increased the resolution of the eye-tracking camera, since the lens is closer to the eye and the image is no longer distorted by the plastic lens of the safety glasses. The blue Loctite Fun Tak mounting putty holds the camera more securely.

    For additional usability, we changed the adjustable strap in the back of the glasses to a shock cord strap. 

  • Converting to an Infrared Eye-Tracking Camera

    Kirsten Olsen, 08/25/2018 at 20:59

    Previously, the eye tracking glasses used a green LED for tracking eye movement. Noise from other light sources was interfering with the system. Plus, the green light shining in the user's eye was distracting. 

    Trading the green LED for an IR LED solves both of these problems. To fully convert the system to IR tracking, we inserted an IR film to filter the eye tracking camera. Because the c270 camera is not very sensitive to IR light, we more than doubled the brightness of the IR LED by swapping the 200 Ohm chip resistor for a 75 Ohm chip resistor. 
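    The brightness claim follows from Ohm's law: with the same supply and forward voltage, dropping the series resistor from 200 Ω to 75 Ω raises the LED current by a factor of 200/75 ≈ 2.7. A quick check (the 5 V USB rail and ~1.2 V IR LED forward voltage are assumed values, not measurements of this board):

```python
V_SUPPLY = 5.0   # USB rail, assumed
V_FORWARD = 1.2  # typical IR LED forward voltage, assumed

def led_current_ma(resistance_ohm):
    """Series-resistor LED current in milliamps via Ohm's law."""
    return (V_SUPPLY - V_FORWARD) / resistance_ohm * 1000

before = led_current_ma(200)  # ~19 mA
after = led_current_ma(75)    # ~51 mA
ratio = after / before        # = 200/75, independent of the assumed voltages
```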

  • Looking into designing and manufacturing Camera PCB

    John Evans, 05/13/2018 at 18:54

    So after a very exhaustive search of USB-capable image sensors, I've pretty much given up.

    Things I've looked into to date:

    I've been very impressed with the ON Semi MIPI CSI-2 sensors such as the MT9M114. My original plan was to design a tiny circuit board around one of their sensors, but after a very exhaustive search I was unable to find anything that can convert that data to Motion JPEG and put it on USB. There is a very large and expensive chip made by Cypress called the EZ-USB® CX3, which looks very cool but is way more than this project requires.

    Additionally, I looked into several 12-bit parallel-interface sensors such as the AR0130CSSC00SPBA0-DP1, with the plan of using a Realtek USB 2.0 camera controller. Unfortunately I do not believe these are available to the public, and I do not want to invest time in a design with almost no support; it's difficult to find datasheets for these devices. One example is the RTS5822. If anyone has additional information about these devices, please let me know.

    I have also looked into using a Raspberry Pi Zero as an IP camera with a MIPI camera such as the MT9M114 and a tiny PCB. Unfortunately the IP stream appears to be delayed by several milliseconds, though I need to do some experimentation to confirm this. (The latency may be extremely low in real terms, just requiring some optimization.)

    What I'd really like to do is set up a Raspberry Pi Zero as a USB camera device per this guide and stream the data from a MIPI camera over USB:

    If anyone would like to take on this project, I'd love to help out. In the meantime, I believe we're going to be stuck with the c270.


  • 1. Building the Eye Tracker Glasses
  • 2. Exposing the PCB from the Camera Housing

    You'll need two cameras. Unbox them.

    Remove the front bezel and unscrew the three retaining screws holding on the front face of the camera housing.

    There are also two screws holding the PCB to the back housing.

    Once the PCB is free from the housing use a pair of tin snips or wire clippers to cut the housing away from the USB cord. Be careful not to damage the USB cord.

    Once both PCBs are removed from their housings, each camera will need some modifications before being mounted on the glasses.

  • 3. Front-Facing Camera Modifications

    Before the front camera can be used, it'll need to be modified with a different lens and lens mount.

    You can 3D print the lens mount. The STL file can be found here:

    The lens can be purchased here:

    Two screws hold the mount onto the PCB and are located on the back of the PCB.





Sultan wrote 6 days ago

Hello, pretty interesting project. I just did not find the "get started" PDF file within your GitHub directory. Could you please advise?


John Evans wrote a day ago

Sorry, it is located under the Files section of this page. I will add it to the GitHub repo shortly.


ilker sahin wrote 08/05/2018 at 11:44

Dear John, as I said in my private message, all we need is free software capable of accurate eye tracking and data analysis. I made head-mounted eye-tracking hardware (a head-mounted IR device with a hacked PS3 cam), composed of an infrared-illuminated PS3 Eye camera, but I couldn't find well-performing free data-analysis software. I tried the OGAMA and ITU Gaze Tracker software, but the results were not satisfactory, accurate, or reliable. Some say PyGaze is good, but it requires basic-to-good knowledge of Python. I look forward to seeing software that can analyse data, with data-analysis options like those in the free OGAMA software, so that we can use your software for neuroscience research.


John Evans wrote 08/13/2018 at 13:47

Thanks for your interest in the project! My time has been getting sucked up by a lot of contracts lately. I've set aside from now until the end of the month to work on this project, so please be on the lookout for updates.


ilker sahin wrote 08/13/2018 at 14:02

There is a free open-source software package called OGAMA, created by a German professor, but he gave it up and there have been no updates since 2015. OGAMA includes a lot of commercial and non-commercial eye-tracking back-ends that collect the tracking data, such as Tobii, ITU Gaze, and Haytham Tracker. After selecting any of these, eye tracking starts with a swift calibration; later, OGAMA can analyse the data from whichever back-end was used with its data-analysis tools. Is it possible to integrate your software into OGAMA? That would make analysis easier and more practical. I suggest you download the free open-source OGAMA; there must be a way to include yours. I tried the non-commercial software included in OGAMA, but the results were not accurate enough to rely on as a researcher. Maybe that stems from the fact that they are incomplete, un-updated projects that someone released as freeware and abandoned. I really wonder, and look forward to seeing, whether your software can collect and analyse fairly accurate eye-tracking data from an IR-illuminated, head-mounted, monocular eye-tracking camera. (Mine uses one camera to track only one eye with the help of infrared lights; I removed the filter.)


Daren Schwenke wrote 05/05/2018 at 20:12

You could do this without obstructing the user's view by using an IR camera and an IR-coated microscope slide or first-surface mirror at a 45-degree angle. I would say gut the IR filter out of a webcam, but it's too small.


John Evans wrote 05/06/2018 at 01:24

You are correct about the IR filter; I have read that it is integrated directly into the sensor, making removal impossible. I'm working on a redesign of the glasses; the frames will be 3D printed. Your recommendation to use an IR-coated mirror is excellent; thanks, I will likely continue with that.


Tom Meehan wrote 04/24/2018 at 04:16

This is an amazing project. I've looked at vision and eye-tracking applications for a while, and yours is great. Thanks for posting this project.


John Evans wrote 04/25/2018 at 03:22

Hi Tom, thank you very much. This is my first open source project so your encouragement is highly appreciated. It has been a lot of fun so far.

