
Low Cost Open Source Eye Tracking

A system designed to be easy to build and set up. It uses two $20 webcams and open source software to provide accurate eye tracking.

The purpose of this project is to convey a location in three-dimensional space to a machine, hands free and in real time.

Currently, it is very difficult to control machines without requiring the user to provide input with their hands, and it can be very difficult to specify a location in space without a complex input device. This system provides a novel solution by allowing the user to specify a location simply by looking at it.

Eye tracking solutions are normally prohibitively expensive and closed source, which limits creators' ability to integrate them into new projects. This solution is fully open source, easy to build, and provides a huge variety of options for makers interested in using this fascinating and powerful technology.

5 minute video:

The project software repository is located on GitHub here: The Jevons Camera Viewer

Please see the GitHub repository for all design files. This is a working prototype.

Please see Eye_Tracker_Software_Quick_Start_Guide.pdf in the files to get started with the software.

The project software also provides several general machine vision capabilities:

  • Real-time optical flow processing
  • Identification of shapes such as circles, lines, and triangles
  • Exposure locking on the C270
  • Image posterization for blob detection
  • Displaying multiple cameras at the same time
  • Mirroring and rotating video feeds in real time
  • Zooming the video
  • Basic edge detection

Real-time Human Machine Interface:

This system uses a set of fixed-point laser beacons on a working surface and two cameras. One camera looks at the user's eye and the other looks forward to see what the user is looking at. The eye camera captures an infrared reflection off the user's eye, calculates the direction of the user's gaze, and defines a "looking vector." The forward camera recognizes the fixed-point beacons on the working surface, measures the pixel distances between the beacons in each video frame, and determines the position of the user relative to the working surface. Using the information from both cameras, the system maps the "looking vector" onto the working surface, communicating the user's desired location in space to a machine simply by looking.
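
As a rough, hedged illustration of the eye-camera half of this pipeline (the pupil/glint coordinates and calibration coefficients below are made-up placeholders, not values from the project), the "looking vector" can be thought of as the offset between the detected pupil center and the corneal glint of the IR LED, mapped to surface coordinates by a calibration fit:

```csharp
using System;

// Minimal sketch, not the project's actual implementation: estimate a gaze
// point on the working surface from the pupil center and the IR glint seen
// by the eye camera. The calibration coefficients are hypothetical values
// that a real calibration routine would fit from "look at this dot" samples.
class GazeSketch
{
    // Affine calibration: surfaceX = a0 + a1*dx + a2*dy (and likewise for Y).
    static readonly double[] CalX = { 400.0, 35.0, 2.0 };   // placeholder coefficients
    static readonly double[] CalY = { 300.0, 1.5, 40.0 };

    static (double X, double Y) GazeOnSurface(double pupilX, double pupilY,
                                              double glintX, double glintY)
    {
        // The "looking vector" in the eye image: pupil center relative to the glint.
        double dx = pupilX - glintX;
        double dy = pupilY - glintY;

        double sx = CalX[0] + CalX[1] * dx + CalX[2] * dy;
        double sy = CalY[0] + CalY[1] * dx + CalY[2] * dy;
        return (sx, sy);
    }

    static void Main()
    {
        var p = GazeOnSurface(312, 240, 305, 252);
        Console.WriteLine($"Estimated gaze point: ({p.X:F1}, {p.Y:F1})");
    }
}
```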

A flow diagram showing how all the different parts of the system work together is shown below.

Future Plans:

Calibration improvements:

Red lasers, diffused with a piece of masking tape, have been used to locate the surface onto which the user's gaze will be mapped. This greatly improves calibration and allows the user's gaze to be fed into the computer as an input. Any surface can be used, including very large production floors or warehouses.

The calibration is going to be converted into a workflow in which the user follows a moving dot. Bad calibration values will be removed automatically by a statistical algorithm that has yet to be implemented, but it will apply basic rules about how far a calibration point should be from the previous one and in which direction: a calibration point more than twice the distance from the last point will be removed (subsequent input will be interpolated, or the user will be asked to repeat calibration), and any calibration point that lies in the opposite direction from the last point will also be removed.
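
A minimal sketch of that rejection rule (the final statistical algorithm is still to be implemented, so treat this as an assumption-laden illustration rather than the project's code):

```csharp
using System.Collections.Generic;
using System.Numerics;

// Sketch of the planned calibration filter: drop a sample whose jump is more
// than twice the previous jump, or whose direction opposes the previous step.
static class CalibrationFilter
{
    public static List<Vector2> Filter(IReadOnlyList<Vector2> samples)
    {
        var kept = new List<Vector2>();
        foreach (var s in samples)
        {
            if (kept.Count < 2) { kept.Add(s); continue; }

            Vector2 lastStep = kept[kept.Count - 1] - kept[kept.Count - 2];
            Vector2 newStep  = s - kept[kept.Count - 1];

            bool tooFar   = newStep.Length() > 2f * lastStep.Length();
            bool reversed = Vector2.Dot(newStep, lastStep) < 0f;  // opposite direction

            if (!tooFar && !reversed)
                kept.Add(s);   // bad samples are skipped; the caller can interpolate or re-run calibration
        }
        return kept;
    }
}
```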

Eye Tracking Logging and Analysis:

Currently no eye tracking data is being logged. A custom-built logging library has been added to address this. Video will also need to be recorded in order to map the user's gaze onto a specific image or video for further analysis, which means a timecode must be recorded with the video and matched against a file containing the user's gaze data. An R script may be written to produce heat maps for static UI analysis.
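
For illustration only (the column layout and units here are assumptions, not the logging library's actual schema), a timecoded gaze record can be as simple as one CSV row per frame, which an R script or any other tool can later join against the video's timecode:

```csharp
using System;
using System.IO;

// Hypothetical log format, not the project's actual schema: one CSV row per
// frame, keyed by a millisecond timestamp that can be matched to the video.
class GazeLogger
{
    static void Main()
    {
        using (var log = new StreamWriter("gaze_log.csv", append: true))
        {
            long ts = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
            log.WriteLine($"{ts},412.7,233.1");   // timestamp_ms, gaze_x, gaze_y
        }
    }
}
```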

Machine Addressing and Control:

The end goal of this project is to interact with a machine and direct its end effector (or the machine itself) to a location. This is currently an extremely difficult problem to solve hands free. Solving it will let the user direct a machine without taking their hands off their current task.

Currently the proof of concept revolves around moving a modified pick and place machine to pick up and move blocks based on the user's eye gaze. Please see the MachineControl branch on GitHub for more information.

Licensing:

This project uses an Apache License, Version 2.0 (APLv2) for all software and hardware components; see LICENSE.txt in the files section.


Eye_Tracker_Software_Quick_Start_Guide.pdf

Brief overview of how to get the software up and running.

Adobe Portable Document Format - 2.53 MB - 08/27/2018 at 13:41


LICENSE.txt

APLv2 license for all software and hardware components

plain - 11.27 kB - 08/26/2018 at 23:24



  • Good Luck to All Hackaday Prize Competitors

    John Evans, 10/22/2018 at 13:57

    It's been a crazy last few months; I put in the time when I could. Starting a new job didn't help, but overall it's been good. Thanks to everyone who helped out, followed this project, commented, and starred the GitHub page. It's been a lot of fun!

  • Mouse Tracking working!

    John Evans, 10/22/2018 at 13:54

    Just finished getting mouse tracking working, so you can now control your mouse with your gaze; however, there was no time to put it into the video. I'll post photos and details soon. Check out the MachineControl branch on GitHub for more details.

  • Timestamped Head Motion Corrected Eye Tracking Data Available

    John Evans, 10/22/2018 at 11:10

    Finally, a long-requested feature is available: timestamped eye tracking data that can be used for heat maps and other static-image eye tracking analysis. Heat maps are a useful way to visualize data for websites and design work, since they reveal how much attention a user pays to each element of the design and can help uncover issues.

    A casual search on Google Images for "eye tracking heat maps" reveals all sorts of interesting information about this technique.

    For example, this image shows the effect of where the model looks in an advertisement. Notice how much more attention is paid to the product in the left image.

  • Laser Beacons Replaced with IR LEDs

    John Evans, 10/22/2018 at 10:56

    In order to provide a reliable and non-intrusive experience for the user, the beacons used for the RANSAC approximation were switched from a red laser diode based design to 940 nm IR LEDs.

    Additionally, an IR filter was added to the front-facing camera, turning the front camera from a typical camera output into a tool for localization and mapping.

  • Ransac Homography Correction Success!

    John Evans, 10/22/2018 at 02:33

    Today we succeeded in correcting for head movement relative to the screen. This was accomplished using Accord CV's RANSAC implementation.

    Homography is a powerful tool: it is relatively light on the CPU yet can be used to do incredible things like estimating movement and perspective, and, in this case, remapping points after the camera has moved.

    This technique is what powers the ability to map the computer mouse to the user's gaze.

    All of this new functionality is available on the MachineControl branch: https://github.com/J-east/JevonsCameraViewer/tree/machineControl
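
    As a rough sketch of what the homography buys us (plain math, not Accord's API; the matrix values are placeholders): once a 3x3 matrix H has been estimated between the beacon positions in a reference frame and the current frame, any point can be remapped by multiplying through H and dividing by the resulting w term.

    ```csharp
    using System;

    // Sketch: apply an already-estimated 3x3 homography H to a point, as used to
    // remap gaze/screen coordinates after the head (and front camera) has moved.
    // H below is a placeholder; in the project it comes from the RANSAC estimator.
    class HomographyRemap
    {
        static (double X, double Y) Apply(double[,] h, double x, double y)
        {
            double xp = h[0, 0] * x + h[0, 1] * y + h[0, 2];
            double yp = h[1, 0] * x + h[1, 1] * y + h[1, 2];
            double w  = h[2, 0] * x + h[2, 1] * y + h[2, 2];
            return (xp / w, yp / w);   // perspective divide
        }

        static void Main()
        {
            double[,] H = { { 1, 0, 12 }, { 0, 1, -5 }, { 0, 0, 1 } };  // placeholder values
            var p = Apply(H, 320, 240);
            Console.WriteLine($"Remapped point: ({p.X:F1}, {p.Y:F1})");
        }
    }
    ```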

  • Development Plans over the next few days:

    John Evans, 10/18/2018 at 22:07

    TCP/IP sockets: A TCP socket project was added for communicating with other applications over the network. This will allow great flexibility in talking to different systems, including wireless ones. (A minimal socket sketch appears at the end of this list.)

    Greater separation of projects: A lot of work has been done to separate the eye tracking logic from the operating-system-specific machine vision. Camera and machine vision logic is held in one project (the .exe), while eye tracking, TCP sockets, and logging are held in separate DLLs. This should make it much easier to eventually expand the project to other systems.

    Perspective transformation: A big goal has been to move away from a monitor-based system so that the tracker can control various machines, such as robotic arms driven by systems like the tinyG stepper motor controller. Other platforms may be used, as long as they can attach to a TCP socket on the network and interpret the commands sent.

    By introducing perspective transformation, known fixed points can be identified with the front-facing camera and the shift between frames used to generate a transformation matrix. This matrix will be applied to the computed eye tracking location in order to correct for head rotation (yaw/pitch/roll) as well as X/Y/Z Cartesian movement relative to the known points. IR will be used on the front-facing camera to ease the identification of these points.

    Decawave time-of-flight location system: Matt will be integrating a Decawave-based location system for a better understanding of head location in Cartesian space. We're hoping for +/- 50 mm of resolution.

    BNO055 9-axis IMU: Matt will also be integrating a BNO055 for a better understanding of the yaw/pitch/roll of the user's head. This may feed a rudimentary data fusion model to correct the fixed-point beacon system we are attempting with the front-facing camera and IR light sources. The team has a fairly limited understanding of homography and more advanced machine vision techniques, so this may end up being essential for properly calibrating the user's gaze relative to the machine being interfaced with.

    TinyG-based machine control: Another big goal is to interface with a tinyG-based CNC system, with the end goal of drawing a picture using the user's gaze.

    More comfortable head gear: My 3D printer is back up and running, so an attempt will be made to print a superior head-mounted unit for carrying the cameras and the additional hardware used in this project.

    MVP, allow the user to interface with a computer screen for better ease of use: The minimum we can hope for is to let the user type a few words using only their eyes. This would be a massive feat if accomplished in a way that is easy for the individual to use. It's unclear what this interface may eventually look like, but it should be (1) accurate, (2) low in eye strain, and (3) easy to learn.
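
    Returning to the TCP/IP sockets item above, here is a minimal sketch of the idea (the port number and message format are illustrative assumptions, not the project's protocol): the tracker publishes gaze coordinates as text lines, and any networked client that connects can consume them.

    ```csharp
    using System.Net;
    using System.Net.Sockets;
    using System.Text;

    // Illustrative only: publish one gaze sample as a text line over TCP so that
    // any client on the network (wireless or not) can consume it. The port and
    // message format are assumptions, not the project's actual protocol.
    class GazeTcpServer
    {
        static void Main()
        {
            var listener = new TcpListener(IPAddress.Any, 9050);   // hypothetical port
            listener.Start();

            using (TcpClient client = listener.AcceptTcpClient())
            using (NetworkStream stream = client.GetStream())
            {
                // A real system would stream one of these per processed frame.
                byte[] msg = Encoding.ASCII.GetBytes("GAZE 412.7 233.1\n");
                stream.Write(msg, 0, msg.Length);
            }
            listener.Stop();
        }
    }
    ```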

  • Setting Up the Fixed-Point Laser Beacons

    Matthew Barker, 08/26/2018 at 22:54

    A few different light sources were tried before settling on lasers as the best option. These lasers are simple 3 volt red dot lasers that have been diffused with a piece of masking tape. The lasers were bright enough to be easily detected by the forward camera even in a brightly lit room.

    Below is a screenshot showing what the forward camera sees (top half) and the reflection of the IR LED on the eye (bottom half).

    In order to recognize the beacons in software, we apply some filters using the Jevons Camera Viewer software.
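
    The actual filter chain lives in the Jevons Camera Viewer, but as a rough standalone illustration (the threshold value and grid size are arbitrary assumptions), isolating the bright, diffused beacon dots can be as simple as thresholding the grayscale frame and taking the centroid of each surviving cluster of pixels:

    ```csharp
    using System.Collections.Generic;

    // Rough illustration, not the project's filter chain: keep only pixels above a
    // brightness threshold, bucket them into a coarse grid, and report one centroid
    // per bucket as a candidate beacon location. A real blob detector would merge
    // neighbouring buckets; the threshold and cell size here are arbitrary.
    static class BeaconFinder
    {
        public static List<(double X, double Y)> Find(byte[,] gray, byte threshold = 200, int cell = 40)
        {
            var sums = new Dictionary<(int, int), (double sx, double sy, int n)>();
            for (int y = 0; y < gray.GetLength(0); y++)
                for (int x = 0; x < gray.GetLength(1); x++)
                    if (gray[y, x] >= threshold)
                    {
                        var key = (x / cell, y / cell);             // coarse grid bucket
                        sums.TryGetValue(key, out var acc);
                        sums[key] = (acc.sx + x, acc.sy + y, acc.n + 1);
                    }

            var centroids = new List<(double X, double Y)>();
            foreach (var acc in sums.Values)
                centroids.Add((acc.sx / acc.n, acc.sy / acc.n));
            return centroids;
        }
    }
    ```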

  • Mounting and Structural Changes

    Kirsten Olsen, 08/26/2018 at 15:45

    To better stabilize the forward-facing camera, we mounted it on a carrier apparatus screwed to the top of the glasses. We chose wood for the carrier, although plastic would be a suitable material as well. This change eliminated the zip-tie mounting.

    We changed the eye-tracking camera's mounting to a through-hole mount drilled directly into the glasses. The new mounting increases the effective resolution of the eye-tracking camera, since the lens is closer to the eye and the image is no longer distorted by the plastic lens of the safety glasses. Blue Loctite Fun-Tak mounting putty holds the camera more securely.

    For additional usability, we changed the adjustable strap in the back of the glasses to a shock cord strap. 

  • Converting to an Infrared Eye-Tracking Camera

    Kirsten Olsen, 08/25/2018 at 20:59

    Previously, the eye tracking glasses used a green LED for tracking eye movement. Noise from other light sources was interfering with the system. Plus, the green light shining in the user's eye was distracting. 

    Trading the green LED for an IR LED solves both of these problems. To fully convert the system to IR tracking, we added an IR filter film over the eye-tracking camera. Because the C270 camera is not very sensitive to IR light, we more than doubled the brightness of the IR LED by swapping the 200 ohm chip resistor for a 75 ohm chip resistor.
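
    For a rough sense of the numbers (assuming roughly 5 V from USB and a forward drop of about 1.3 V for a 940 nm LED; these are assumptions, not measured values): the 200 ohm resistor gives about (5 - 1.3) / 200 ≈ 18 mA of drive current, while the 75 ohm resistor gives about (5 - 1.3) / 75 ≈ 49 mA, a bit under three times as much.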

  • Looking into designing and manufacturing Camera PCB

    John Evans, 05/13/2018 at 18:54

    So after a very exhaustive search of USB-capable image sensors, I've pretty much given up.

    Things I've looked into to date:

    I've been very impressed with the On-Semi MIPI CSI-2 interface sensors such as the MT9M114. My original plan was to design a tiny circuit board around one of their sensors, but after a very exhaustive search I was unable to find anything that can convert that data to Motion-JPEG and put it on USB. There is a very large and expensive chip made by Cypress called the EZ-USB CX3 which looks very cool, but it is far more than this project requires.

    Additionally, I looked into several 12-bit parallel interface sensors such as the AR0130CSSC00SPBA0-DP1, with the plan being to use a Realtek USB 2.0 camera controller. Unfortunately, I do not believe these are available to the public, and I do not want to invest time in a design with almost no support. It's difficult to find datasheets for these devices; one example is the RTS5822. If anyone has additional information about these devices, please let me know.

    I have looked into using a Raspberry Pi Zero as an IP camera with a MIPI camera such as the MT9M114 on a tiny PCB. Unfortunately, the IP stream appears as though it may be delayed by several milliseconds; I need to do some experimentation to confirm this. (The latency may be extremely low in real terms, just requiring some optimization.)

    What I'd really like to do is set up a Raspberry Pi Zero as a USB camera device per this guide and stream the data from a MIPI camera over USB: https://learn.adafruit.com/turning-your-raspberry-pi-zero-into-a-usb-gadget?view=all#other-modules

    If anyone would like to take on this project, I'd love to help out. But in the meantime I believe we're going to be stuck with the C270.


  • 1. Building the Eye Tracker Glasses
  • 2. Exposing the PCB from the camera housing

    You'll need two cameras. Unbox them.

    Remove the front bezel and unscrew the three retaining screws holding on the front face of the camera housing.

    There are also two screws holding the PCB to the back housing.

    Once the PCB is free from the housing use a pair of tin snips or wire clippers to cut the housing away from the USB cord. Be careful not to damage the USB cord.

    Once both PCBs are removed from the housings, each camera will need some modifications before mounting onto the glasses.

  • 3. Front-Facing Camera Modifications

    Before the front camera can be used it'll need to be modified with a different lens and lens mount.

    You can 3d print the lens mount. The STL file can be found here:  https://goo.gl/PNbTHv

    The lens can be purchased here: https://goo.gl/cxPK4H

    Two screws hold the mount onto the PCB and are located on the back of the PCB.



Discussions

Wenkain wrote 05/21/2024 at 01:37

Dear John, thank you for your work. By reading the "get started" PDF file, I can run the code and access the camera viewer, but every time I attempt to select a USB camera and click 'go', I receive a message that my camera is being used by another process. I have reset and fiddled with my camera settings several times, but the problem persists. I wonder how I can solve it.


Felipe Concha wrote 10/03/2022 at 21:05

Dear John, thank you very much for this project. I am enjoying it a lot. I had some trouble with the eye camera modifications: as someone said above, the camera has an integrated IR filter that blocks IR light. I found a video explaining how to gently "destroy" the glass filter, which worked well for me. I'll copy the link for anyone who might need it: https://www.youtube.com/watch?v=ihuHuC12328&t=562s

It's been a while since your last update, but I am looking forward to the 3D frames you were designing. I am working on that too, and I hope I can finish it soon and share it through this site.


Sultan wrote 09/14/2018 at 11:33

Hello, pretty interesting project. I just did not find the "get started" PDF file within your GitHub directory. Could you please advise?


John Evans wrote 09/18/2018 at 21:50

Sorry, it is located under the files section of this page. I will add it to the GitHub repository shortly.


ilker sahin wrote 08/05/2018 at 11:44

Dear John, as I mentioned in my private message, all we need is free software capable of accurate eye tracking and data analysis. I made head-mounted eye tracking hardware (a head-mounted IR device with a hacked PS3 camera), composed of an infrared-illuminated PS3 Eye camera, but I couldn't find well-performing free data analysis software. I tried the OGAMA and ITU Gaze Tracker software, but the results were not satisfactory, accurate, or reliable. Some say PyGaze is good, but it requires a basic working knowledge of Python. I look forward to seeing John's software able to analyze data, with analysis options like those in the free OGAMA software, so that we can use it for neuroscience research.


John Evans wrote 08/13/2018 at 13:47

Thanks for your interest in the project! My time has been getting sucked up by a lot of contracts lately. I've set aside from now until the end of the month to work on this project, so please be on the lookout for updates.


ilker sahin wrote 08/13/2018 at 14:02

There is a free open source package called OGAMA, created by a German professor, but he gave it up and there have been no developments or updates since 2015. OGAMA includes a number of commercial and non-commercial eye tracking back-ends that collect the tracking data, such as Tobii, ITU Gaze, and Haytham Tracker. After selecting any of these, eye tracking starts, accompanied by a swift calibration; OGAMA can then analyze the data collected through the selected back-end with its analysis tools. Is it possible to integrate your software into OGAMA, so that analysis becomes easier and more practical? I suggest you download the free open source OGAMA; there must be a way to include yours. I tried the non-commercial back-ends included in OGAMA, but they didn't produce satisfactory results; they were not accurate enough to rely on as a researcher. Maybe that stems from the fact that they are incomplete, abandoned projects that someone released as freeware and then gave up. I really wonder, and look forward to seeing, whether your software would be able to collect and analyze reasonably accurate eye tracking data from an IR-illuminated, head-mounted, monocular eye tracking camera (mine uses one camera to track only one eye with the help of infrared lights; I removed the filter).


Daren Schwenke wrote 05/05/2018 at 20:12

You could do this without obstructing the user's view by using an IR camera and an IR-coated microscope slide or first-surface mirror at a 45 degree angle. I would say gut the IR filter out of a webcam, but it's too small.


John Evans wrote 05/06/2018 at 01:24

You are correct about the IR filter; I have read that it is integrated directly into the sensor, making removal impossible. I'm working on a redesign of the glasses; the frames will be 3D printed. Your recommendation to use an IR-coated mirror is excellent, thanks. I will likely continue with that.


Tom Meehan wrote 04/24/2018 at 04:16

This is an amazing project. I've looked at vision/eye tracking applications for a while, and yours is great. Thanks for posting this project.


John Evans wrote 04/25/2018 at 03:22

Hi Tom, thank you very much. This is my first open source project so your encouragement is highly appreciated. It has been a lot of fun so far.

