
Gesture Recognition Wearable

Using machine learning to simplify the Twitch chat interface for my tvhead

All files are in the GitHub repo.

Atltvhead Gesture Recognition Bracer - A TensorFlow gesture detector for the atltvhead project and an exploration into data science

This repository is my spin on Jennifer Wang's and Google TensorFlow's Magic Wand project.

Using accelerometer data and a CNN model, it detects gestures during live streams. The gesture data is sent through Twitch chat to my tvhead project, changing which animations are available for the chat to control.

The machine learning model interplays with Twitch chat, simplifying the tvhead chat API.


I run an interactive live stream. I wear an old TV (with a working LED display) as a helmet, along with a backpack that has its own display. Twitch chat controls what's shown on the television screen and the backpack screen through chat commands. Together, Twitch chat and I go through the city of Atlanta, GA spreading cheer.

As time has gone on, I've accumulated over 20 channel commands for the TV display. Remembering, or even copy-pasting, them all has become complicated and tedious. So it's time to simplify my interface to the tvhead.

What are my resources?
During the live stream, I am on rollerblades: my right hand holds the camera, my left hand wears a high-five-detecting glove I built from a time-of-flight sensor and an ESP32, my backpack carries a Raspberry Pi 4, and there is a website with buttons that post commands in Twitch chat.

What to simplify?
I'd like to simplify the channel commands and gamify them a bit more.

What resources will I use?
I am going to modify my high-five glove, removing the time-of-flight lidar sensor, and feed the Raspberry Pi acceleration and gyroscope data, so that the Pi can infer the gesture performed from the arm data.
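On the Pi, inference could look roughly like the sketch below. This is a minimal illustration, not the project's actual code: the model file name gesture_cnn.tflite, the (1, 76, 6) input window, and the gesture labels are all placeholder assumptions.

```python
# Minimal sketch of Pi-side gesture inference with TensorFlow Lite.
# Model path, input shape, and labels are placeholders, not the project's real values.
import numpy as np
import tflite_runtime.interpreter as tflite  # or tensorflow.lite on a desktop

GESTURES = ["gesture_a", "gesture_b", "no_gesture"]  # hypothetical labels

interpreter = tflite.Interpreter(model_path="gesture_cnn.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def classify(window: np.ndarray) -> str:
    """window: float32 array shaped like the model input, e.g. (1, 76, 6)."""
    interpreter.set_tensor(input_details[0]["index"], window.astype(np.float32))
    interpreter.invoke()
    scores = interpreter.get_tensor(output_details[0]["index"])[0]
    return GESTURES[int(np.argmax(scores))]
```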

Here is my game plan!

  • 1 × ESP32 Thing Plus by SparkFun
  • 1 × LSM6DSOX + LIS3MDL - Precision 9 DoF IMU - STEMMA QT / Qwiic by Adafruit
  • 1 × Push Button
  • 1 × Raspberry Pi 4

  • New Control Scheme

    nate.damen, 07/24/2020 at 19:58

    Using both gesture modes and channel commands, the animations break down in equation style (a minimal sketch of this mapping follows below):
    Gesture A + command "1" = Animation A

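As an illustration, the gesture-plus-command scheme amounts to a lookup table. The sketch below is hypothetical Python, not the project's implementation; the gesture, command, and animation names are placeholders.

```python
# Hypothetical "Gesture + command = Animation" lookup.
# All names are placeholders for the real gestures and animations.
ANIMATION_TABLE = {
    ("gesture_a", "1"): "animation_a",
    ("gesture_a", "2"): "animation_b",
    ("gesture_b", "1"): "animation_c",
}

def pick_animation(gesture, chat_command):
    """Resolve the animation for the current gesture mode and chat command."""
    return ANIMATION_TABLE.get((gesture, chat_command))  # None if unmapped
```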

  • 1
    Choosing a Sensor

    A brief explanation of my sensor choices:

    Lidar: I used "lidar" time-of-flight sensors to detect high fives in the previous version of my wearable. However, they cannot detect arm gestures without the complexity of mounting a ton of them all over one arm.

    Strain Sensor: Using the change in resistance of stretch rubbers, I can get a sense of which muscles I'm actuating, or the general shape of my arm. However, they are easily damaged and wear out with use.

    Muscle Sensors: Something like a Myo armband can determine hand gestures, but it requires a lot of processing overhead for my use case. They are also quite expensive.

    IMU: Acceleration and gyroscope sensors are cheap and do not wear out over time. However, determining a gesture from the raw sensor output normally requires a lot of manual thresholding and timing to get anything useful. Luckily, machine learning can find the relationships in the data, and a model can even be implemented on a microcontroller with TensorFlow Lite (tflite) and TinyML.

    I chose to go forward with an IMU sensor and Machine Learning. My sensor is the LSM6DSOX from ST on an Adafruit Qwiic board.

  • 2
    Arduino Setup

    In the Arduino_Sketch folder of this project's GitHub repo is AGRB-Traning-Data-Capture.ino, a sketch that pipes acceleration and gyroscope data from an Adafruit LSM6DSOX 6-DoF IMU out over the USB serial port. The ESP32 Thing Plus by SparkFun is the board I've chosen, due to the Qwiic connector support between it and the Adafruit IMU. A push button is connected between ground and pin 33, with the internal pull-up resistor enabled. Eventually, I plan to deploy a tflite model on the ESP32, so I've included a battery.

    Every time the push button is pressed, the ESP32 sends out 3 seconds of acceleration and gyroscope data over USB; a minimal Pi-side logging sketch follows below.
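On the receiving end, the Pi (or any computer) just needs to log those serial bursts to a file for training. This is a minimal pyserial sketch; the port name, baud rate, and comma-separated line format are assumptions, so check AGRB-Traning-Data-Capture.ino for the actual values.

```python
# Minimal serial logger for the 3-second gesture bursts.
# Port name, baud rate, and line format are assumptions; the real ones
# are defined in AGRB-Traning-Data-Capture.ino.
import serial

PORT = "/dev/ttyUSB0"   # typical for an ESP32 on a Pi; adjust as needed
BAUD = 115200           # assumed baud rate

with serial.Serial(PORT, BAUD, timeout=5) as ser, open("gesture_log.csv", "a") as log:
    while True:
        line = ser.readline().decode("utf-8", errors="ignore").strip()
        if line:  # e.g. one "aX,aY,aZ,gX,gY,gZ" sample per line during a burst
            log.write(line + "\n")
```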

  • 3
    Build A Housing

    I used Fusion 360 to model my arm and a square, case-like housing. It took some revisions to get the fit perfect, but it's actually rather comfortable, even on a hot day in Hotlanta. All the CAD files are in the GitHub repo linked on this project.

