Gestum Glove

Multipurpose, low-cost hand tracking.

Gestum is a platform to provide gesture input for computers, and assistive tech for humans.

What

This project is the end result of a task to create an open source gesture input toolkit for my lab while I was an undergraduate research aide in college. I still hack on it in my free time, but now with the goal of bringing gesture input and niche assistive use cases together.

The aim is to use orientation sensors to track individual fingers on a hand. The resulting system, worn as a glove, can then be used to interpret gestures, recognize sign language, type with one hand, or track patient progress in physical therapy.

Why

Recent developments in computer hardware and software suggest that the "next big thing" in computing is going to be a shake-up of how users interact with their devices. This is evident in Apple, Google, and Microsoft's rabid push of voice recognition systems in phones and laptops. Following that trajectory, gesture input may soon be a staple of computer interaction.

Existing solutions seem to be pretty limited. Motion capture suits often leave out fine hand detail because their goal is to record broad character movements for video animation. Camera-based tracking is expensive and constrained to a fixed capture area. Most VR gesture tracking is too coarse to resolve individual fingers.

There are many other similar projects, but they tend to target a specific use case and wind up as bespoke hardware, bespoke firmware, and a custom data link that speaks to a single desktop application. I want to make a common platform that can then be used to meet many end use cases, hopefully providing a robust and cost-effective base for more gesture hacking projects and even real-world use.

Regardless, sometimes it feels like I'm just one more hamster on the wheel.

How

A human factors lab at my local college is investigating gesture input methods for computing. I want to build assistive tools. By combining use cases I hope to develop something with staying power, much like how GPUs are great for medical research even though the GPU industry's money comes from video games. If Gestum can be good at common gesture input, then that user base may help make the system more available for other applications.

Who

Well, there's myself and some amazing folks over at H2I (the Human Interface Innovation Lab) at a nearby college. The lab is working on using "natural" gestures as input to computers. During my undergrad I worked for H2I on their open source kit for gesture input, and now that I've left I still contribute to the project as a community member because I'm interested in using the same tech for tracking physical therapy progress and interpreting sign language.

Other Projects

There are several other open source projects with similar goals. Originally I just wanted to do sign language interpretation, but after reading about the other ways IMU gloves are being built for niche targets, I want to see if one platform could meet many end uses.

Helpful Links

Information that I've found useful while working on this project.


  • 1 × BNO055 IMU from Adafruit
  • 1 × TCA9548A Adafruit I2C mux
  • 1 × ATmega328P microcontroller
  • 1 × 3.7V LiPo battery
  • 1 × ESP-01

  • Wireless Sensor Test

    Christopher • 08/27/2017 at 17:22

    It works! Quaternions are gathered from the BNO and sent back to a desktop application:

    Now the real work begins:

    • Need a second one!
      • I'm sharing this one with a local college; having two means that group and I can work without stepping on toes.
    • Need to hook up more BNO055s and test with all of them being polled.
      • At least 3 initially for thumb, index, and middle fingers.
    • Write real desktop software to do user testing, not this repurposed Qt example stuff.
      • I need to learn more about the math involved; you can see in the gif that the rotations are being applied backwards (see the convention sketch after this list).
    • Start work on the next PCB version!
      • Will use surface mount components!
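
    On the backwards rotation issue: a common culprit is the quaternion rotation convention. A unit quaternion q rotates a vector v as q·v·q⁻¹; putting the conjugate on the wrong side (q⁻¹·v·q) applies the inverse rotation, which looks exactly like motion playing backwards. Here's a minimal, library-free sketch of the convention to check against (my own illustration, not the project's desktop code):

    ```cpp
    #include <cstdio>

    struct Quat { double w, x, y, z; };

    // Conjugate of a unit quaternion is its inverse.
    Quat conjugate(Quat q) { return {q.w, -q.x, -q.y, -q.z}; }

    // Hamilton product of two quaternions.
    Quat mul(Quat a, Quat b) {
      return {
        a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
        a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
        a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
        a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w,
      };
    }

    // Rotate vector v (as a pure quaternion) by unit quaternion q.
    Quat rotate(Quat q, Quat v) { return mul(mul(q, v), conjugate(q)); }

    int main() {
      // 90 degrees about Z: (w, x, y, z) = (cos 45°, 0, 0, sin 45°)
      Quat q = {0.7071067811865476, 0, 0, 0.7071067811865476};
      Quat v = {0, 1, 0, 0};  // unit X vector as a pure quaternion
      Quat r = rotate(q, v);  // expect ~(0, 1, 0): X rotated onto +Y
      std::printf("%.3f %.3f %.3f\n", r.x, r.y, r.z);
      return 0;
    }
    ```

    Rotating unit X by 90° about Z should land on +Y; this prints 0.000 1.000 0.000, confirming the q·v·q⁻¹ ordering. Swapping the conjugate to the other side flips the result.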

    I've handed off the functioning PCB to H2I, where they'll begin working to integrate it into gesture recognition tools.

  • Troubleshooting the First PCB

    Christopher • 08/27/2017 at 14:45

    Backdated to August 10th, 2017

    So there are some bugs with the first assembled PCB. I'm glad now for the decision to make the PCB design as close to the breadboarded prototype as possible, because we have a reference to work with.

    It turns out I left a connection off the ESP pins: CH_PD was left floating, and it needs to be pulled up to Vcc for the ESP to work the way we want. A small wire soldered in fixes that:


    But now that we can see the microcontroller serial output from the ESP, the ATmega boot loop is still there :-/

    Again, it was pretty fast to find the issue: the PCB has a new ESP8266 on it with stock esp-link firmware, so the GPIO pin that controls our ATmega's reset line hadn't been configured yet and was producing spurious resets. With that config change made, the resets stopped and we could finally start testing wireless transmit!

  • PCBs Arrive

    Christopher • 08/24/2017 at 04:37

    Backdated to July 24th, 2017

    The OSHPark PCBs:

    Assembled:

    Testing fails: the ESP's AP doesn't show up, and the ATmega is stuck in a boot loop.

  • Designing a Prototype PCB

    Christopher • 08/23/2017 at 12:57

    Backdated to July 12th, 2017

    The proposed PCB design:


    Once the breadboarded proof of concept was working, it was time to learn how to use KiCad. I'd never (successfully) designed a PCB layout before, so there were a lot of tutorials to read through and examples to follow before tackling this board.

    I named the PCB Opisthenar, which supposedly means "the back of the hand" in medical terminology; the current implementation straps this board to the back of the tracking glove. The design was meant to be 1:1 against the breadboarded precursor, both to keep the layout simple (for my sake) and to avoid introducing unexpected bugs, so that any problems could be traced down a narrow troubleshooting path.

    Now that we have a nicer image of the circuit to look at, let's walk through the design. At the heart is an ATmega328P, which was fast to implement because I'm familiar with avr-libc, but it will likely one day be absorbed by the ESP8266, which has plenty of leftover horsepower. The ATmega is connected to an I2C mux, and through the mux to the BNO055 orientation sensors. RX and TX on the ATmega go to an ESP-01 running esp-link, which makes the serial connection wireless. And that's really about it: the design is simple, and hopefully easy for others to take and build cool projects on!
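
    To make that data path concrete, here's a rough Arduino-style C++ sketch of the polling loop using Adafruit's BNO055 library and the TCA9548A mux. This is not the project's actual firmware; the channel count, sensor IDs, baud rate, and line format are assumptions for illustration:

    ```cpp
    #include <Wire.h>
    #include <Adafruit_Sensor.h>
    #include <Adafruit_BNO055.h>

    const uint8_t TCA_ADDR = 0x70;   // TCA9548A default I2C address
    const uint8_t NUM_SENSORS = 3;   // thumb, index, middle to start

    // One driver instance per finger; all sensors keep the BNO055's default
    // address because the mux isolates each one on its own I2C channel.
    Adafruit_BNO055 bno[NUM_SENSORS] = {
      Adafruit_BNO055(55), Adafruit_BNO055(56), Adafruit_BNO055(57)
    };

    // Route the shared I2C bus to one of the TCA9548A's eight channels.
    void tcaSelect(uint8_t channel) {
      Wire.beginTransmission(TCA_ADDR);
      Wire.write(1 << channel);
      Wire.endTransmission();
    }

    void setup() {
      Serial.begin(115200);          // UART to the ESP-01 running esp-link
      Wire.begin();
      for (uint8_t i = 0; i < NUM_SENSORS; i++) {
        tcaSelect(i);
        bno[i].begin();              // defaults to NDOF fusion mode
      }
    }

    void loop() {
      for (uint8_t i = 0; i < NUM_SENSORS; i++) {
        tcaSelect(i);
        imu::Quaternion q = bno[i].getQuat();
        // One line per sensor: "index w x y z", easy to parse on the desktop.
        Serial.print(i);        Serial.print(' ');
        Serial.print(q.w(), 4); Serial.print(' ');
        Serial.print(q.x(), 4); Serial.print(' ');
        Serial.print(q.y(), 4); Serial.print(' ');
        Serial.println(q.z(), 4);
      }
      delay(10);                     // roughly 100 Hz across the set
    }
    ```

    A plain text line per sensor keeps the serial bridge trivial to parse on the desktop side, at the cost of some bandwidth.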

  • Breadboard prototype

    Christopher • 08/11/2017 at 03:21

    Backdated to June 30th, 2017

    The first wireless setup:

    This is from back when I worked as a research assistant. We were investigating accurate, open source ways to get gesture input into a computer after seeing how simultaneously expensive and lacking commercial systems could be. One of the approaches was to try our hand at producing an open source gesture tracking system in-house, which I've since forked into Gestum after parting ways with the lab.

    A lot of time was spent chasing full 3D position tracking with IMUs, but it just didn't have a prayer with current consumer hardware: double-integrating accelerometer readings into position accumulates drift within seconds. Once we'd shifted focus to orientation/rotation based tracking, things started to move a little faster. The image above is of a BNO055 IMU reporting rotations to an ATmega, which then sends them over an ESP8266 telnet bridge back to a desktop.
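
    On the desktop side, esp-link exposes the ATmega's UART as a plain TCP socket (port 23 by default), so any telnet-capable client can read the stream. A bare-bones POSIX C++ reader might look like the following; the IP address and the newline-delimited format are assumptions for illustration:

    ```cpp
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdio>
    #include <string>

    int main() {
      int fd = socket(AF_INET, SOCK_STREAM, 0);
      if (fd < 0) { perror("socket"); return 1; }

      sockaddr_in addr{};
      addr.sin_family = AF_INET;
      addr.sin_port = htons(23);                          // esp-link serial bridge
      inet_pton(AF_INET, "192.168.4.1", &addr.sin_addr);  // ESP AP default IP

      if (connect(fd, (sockaddr*)&addr, sizeof(addr)) < 0) {
        perror("connect");
        return 1;
      }

      // Accumulate bytes and print one complete rotation line at a time.
      std::string buf;
      char chunk[256];
      ssize_t n;
      while ((n = read(fd, chunk, sizeof(chunk))) > 0) {
        buf.append(chunk, n);
        size_t pos;
        while ((pos = buf.find('\n')) != std::string::npos) {
          std::printf("rotation: %s\n", buf.substr(0, pos).c_str());
          buf.erase(0, pos + 1);
        }
      }
      close(fd);
      return 0;
    }
    ```

    Because it's just a socket, the same stream can be piped into Qt test apps, loggers, or gesture recognizers without touching the firmware.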

