
Somatic - Wearable Control Anywhere

Hand signs and gestures become keystrokes and mouse clicks

The Somatic is a wearable keyboard and mouse. It translates hand signs and motions into actions, like the somatic component of a spell in Dungeons and Dragons.

Each knuckle has a Hall sensor, and the first segment of each finger has a magnet. Flexing a finger pivots its magnet out of position, allowing the Somatic to map your hand.
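To make that concrete, here's a minimal Arduino-style sketch of how those on/off sensors could be packed into a finger-state bitmask. The pin assignments and sensor polarity are placeholder assumptions for illustration, not the actual Somatic firmware:

```cpp
// Finger-mapping sketch: one digital Hall sensor per knuckle becomes one
// bit of hand state. Pin numbers are hypothetical, and the sensors are
// assumed to have open-drain digital outputs.
const int HALL_PINS[4] = {2, 3, 4, 5};  // index, middle, ring, pinky

void setup() {
  for (int i = 0; i < 4; i++) {
    pinMode(HALL_PINS[i], INPUT_PULLUP);
  }
  Serial.begin(115200);
}

// Pack the four sensors into a bitmask; bit set = finger flexed.
uint8_t readFingerState() {
  uint8_t state = 0;
  for (int i = 0; i < 4; i++) {
    // Flexing pivots the magnet away, so the sensor stops seeing the field.
    // Output polarity is an assumption; invert if your sensors differ.
    if (digitalRead(HALL_PINS[i]) == HIGH) {
      state |= (uint8_t)(1 << i);
    }
  }
  return state;
}

void loop() {
  Serial.println(readFingerState(), BIN);
  delay(20);
}
```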

An EM7180SFP IMU near the thumb provides 9-DoF orientation tracking. Eventually, this will allow you to move a mouse cursor by pointing and to type letters by drawing them in midair.

Thanks to Alex Glow for modeling the Somatic!

All files are on GitHub!

The Somatic project's priorities are:

  • Control any wearable computer with a heads-up display
  • Ready to use all day, instantly, with no Internet
  • Doesn't cause fatigue or interfere with other tasks
  • Fast enough to do a quick search in less than 10 seconds

The Somatic will not:

  • Reproduce your hand in 3-D space
  • Let you type on a virtual keyboard
  • Use any cloud services at all

The project is still in a pretty rough state. The roadmap includes:

  • Build a training utility
  • Collect gesture samples
  • Use an artificial neural network to recognize letters
  • Implement gyro mouse
  • Implement Bluetooth stuff
  • Lay out and fab circuit board
  • Make case smaller
  • Replace on/off Hall sensors with continuous sensors

The Somatic project is MIT licensed, copyright 2019 Zack Freedman and Voidstar Lab.

  • Give 'em a Hand

    Zack Freedman, 01/04/2020 at 01:51

    The Supercon crowd really loved the data glove - enough to convince me to put it online.

    The project looks pretty good, but inside, it's pandemonium. Beneath the clean-looking case (I'm pretty proud of that; it came out great), the Somatic prototype is a mess of perfboard and sloppily soldered dev boards. The code does basically nothing, and the training utility does even less.

    But, the hardware works, and I have my tasks ahead of me. From easy to hard:

    • I have code to recognize hand signs, output keystrokes, and do the gyro mouse stuff from a previous data glove. Gotta port that over to this hardware. (A sketch of the gyro-mouse math follows this list.)
    • Finish the training utility. This will be fed into TensorFlow to recognize gestures. I've never actually built a desktop GUI app, but the learning curve of Tkinter is pretty forgiving.
    • Gather and process training data. I estimate I'll need something like 100 samples of each of the roughly 100 glyphs. I'll need some serious podcasts (and wrist support) to draw ten thousand letters in the air. Preprocessing should be pretty simple - the IMU itself produces unit quaternions; my job will be to normalize the sequences so they're all the same length (see the resampling sketch after this list).
    • Make the artificial neural network. I've never done anything close to this before, but it seems pretty straightforward. The priority is making a network that can be easily ported to the Teensy. I'm not going to train directly on the glove - it'll just run the model (see the inference sketch after this list).
    • Build custom PCBs and all that jazz to make it smaller and perform better. Normally I'd call this easy, but the Teensy 4.0's new chip is BGA and looks like a nightmare to solder with hot air.
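
    First, the gyro-mouse sketch promised above: extract yaw and pitch from each orientation quaternion and emit the frame-to-frame deltas as pointer motion. The sensitivity constant, the sign conventions, and the Teensy USB Mouse call are assumptions for illustration, not the code I'm porting:

```cpp
#include <math.h>

const float SENSITIVITY = 400.0f;  // pixels per radian; tune to taste

float lastYaw = 0.0f, lastPitch = 0.0f;
bool havePrev = false;

// Extract yaw and pitch (Z-Y-X Tait-Bryan angles) from a unit quaternion.
void quatToYawPitch(float w, float x, float y, float z,
                    float &yaw, float &pitch) {
  yaw = atan2f(2.0f * (w * z + x * y), 1.0f - 2.0f * (y * y + z * z));
  float s = 2.0f * (w * y - z * x);
  if (s > 1.0f) s = 1.0f;
  if (s < -1.0f) s = -1.0f;
  pitch = asinf(s);
}

// Call once per IMU sample with the latest unit quaternion.
void updateMouse(float w, float x, float y, float z) {
  float yaw, pitch;
  quatToYawPitch(w, x, y, z, yaw, pitch);
  if (havePrev) {
    // Signs depend on how the IMU is mounted; flip as needed.
    // Yaw wrap-around at +/-pi is ignored here for brevity.
    int dx = (int)(-(yaw - lastYaw) * SENSITIVITY);
    int dy = (int)((pitch - lastPitch) * SENSITIVITY);
    Mouse.move(dx, dy);  // Teensy USB HID mouse
  }
  lastYaw = yaw;
  lastPitch = pitch;
  havePrev = true;
}
```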
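
    Next, the resampling sketch for preprocessing: stretch or squeeze a recorded quaternion sequence to a fixed length using normalized linear interpolation (nlerp), which I'm assuming is close enough to true slerp at IMU sample rates:

```cpp
#include <vector>
#include <cmath>

struct Quat { float w, x, y, z; };

// Interpolate between two unit quaternions and renormalize (nlerp).
Quat nlerp(Quat a, Quat b, float t) {
  // Negate one endpoint if needed so we interpolate the shorter arc.
  float dot = a.w * b.w + a.x * b.x + a.y * b.y + a.z * b.z;
  if (dot < 0.0f) { b.w = -b.w; b.x = -b.x; b.y = -b.y; b.z = -b.z; }
  Quat q = { a.w + t * (b.w - a.w), a.x + t * (b.x - a.x),
             a.y + t * (b.y - a.y), a.z + t * (b.z - a.z) };
  float n = std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
  q.w /= n; q.x /= n; q.y /= n; q.z /= n;
  return q;
}

// Resample `in` (length >= 2) to exactly `outLen` (>= 2) quaternions.
std::vector<Quat> resample(const std::vector<Quat> &in, size_t outLen) {
  std::vector<Quat> out(outLen);
  for (size_t i = 0; i < outLen; i++) {
    float pos = (float)i * (in.size() - 1) / (float)(outLen - 1);
    size_t k = (size_t)pos;
    if (k + 1 >= in.size()) { out[i] = in.back(); continue; }
    out[i] = nlerp(in[k], in[k + 1], pos - (float)k);
  }
  return out;
}
```

    Fixing the length up front means the network's input size is a compile-time constant, which should keep the Teensy port simple.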
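
    And finally, the inference sketch: since the glove only *runs* the model, a small fully-connected network boils down to a couple of loops. The weights would be exported from the trained TensorFlow model as C arrays; everything here is a hypothetical placeholder, not a trained network:

```cpp
#include <math.h>

// One fully-connected layer: out = weights * in + bias, optional ReLU.
void denseLayer(const float *in, int inLen,
                const float *weights,  // row-major, outLen x inLen
                const float *bias, float *out, int outLen, bool relu) {
  for (int o = 0; o < outLen; o++) {
    float acc = bias[o];
    for (int i = 0; i < inLen; i++) {
      acc += weights[o * inLen + i] * in[i];
    }
    out[o] = relu ? fmaxf(acc, 0.0f) : acc;
  }
}

// Pick the highest-scoring output; its index maps to a glyph.
int argmax(const float *v, int n) {
  int best = 0;
  for (int i = 1; i < n; i++) {
    if (v[i] > v[best]) best = i;
  }
  return best;
}
```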

    Anyways, stay tuned, I think I'm pretty close to finishing the training utility... as soon as I can figure out how to render the quaternions and gesture path...
