3D Haptic Vest for Visually Impaired and Gamers

This project's goal was to make a vest that a visually impaired person or a gamer could use when moving around in their environment (real or virtual).


This project was created on 07/19/2014 and last updated a year ago.

This vest allows the visually impaired or blind to perceive their environment in 3D through its array of 48 vibration motors. It could also give gamers playing online or console games cues and other important information. The vest is intuitive to use.



How the vest works:

This project uses a matrix of 48 vibration motors that wraps around the torso. The vest sends tactile feedback to the user in such a way that they can perceive the environment in front of them in 3D. This is done by using varying levels of vibration: eight levels were used of the possible 4,096. The closer something is to the wearer (or the louder the sound in gaming), the higher the intensity of vibration in the motors that correspond to the location of the object (or sound). Each motor draws a maximum of 60 milliamps, so if all 48 motors ran at maximum speed for an hour, they would consume 2.88 amp-hours (48 × 60 mA = 2.88 A). A depth sensor, a computer, a battery pack, and a microprocessor with ICs (integrated circuits) are needed to run the vest. The computer, the battery pack, and the electronics are contained inside a backpack.
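As a sketch of the distance-to-intensity mapping described above, quantizing depth into 8 of the driver's 4,096 PWM levels might look like the following (the 4 m range and the exact scaling are my assumptions, not the project's measured constants):

```python
def depth_to_pwm(distance_m, max_range_m=4.0, levels=8, pwm_max=4095):
    """Map a depth reading to one of 8 discrete PWM intensities.

    Closer objects produce stronger vibration; readings at or beyond
    max_range_m (or invalid negative readings) turn the motor off.
    The range and scaling constants here are illustrative only.
    """
    if distance_m < 0 or distance_m >= max_range_m:
        return 0
    # Invert: nearer objects get a higher discrete level (1..levels).
    level = levels - int(distance_m / max_range_m * levels)
    # Scale the discrete level onto the driver's 12-bit (0-4095) range.
    return level * pwm_max // levels
```

In this sketch the same function would serve the gaming mode by feeding in loudness instead of distance.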

The depth sensor used in this project is the Microsoft Kinect. Because the Kinect senses depth with an infrared camera, the vest still works in complete darkness.


The skin, a sense organ, contains the biological sensors of touch: the thermoreceptors (responsible for thermal sensing), the nociceptors (responsible for pain sensing), and the mechanoreceptors (sensitive to mechanical stimulus and skin deformation). Of the four kinds of mechanoreceptors, the one relevant to this project is the Pacinian corpuscle, which responds to vibration.

The ability to discriminate stimuli on the skin differs throughout the body. The two-point discrimination threshold (TPDT) is a measure of how far apart two pressure points must be in order to be perceived as two distinct points on the skin. The link shows the TPDT for different areas of the body; those measurements were used in the design of this project.

The final product used neoprene squares that were put on 1" pieces of foam. Vibration motors were mounted on the neoprene platforms, and the platforms were mounted on the neoprene vest 2" apart. This version worked significantly better because the isolated platforms kept the motors from spreading vibrations throughout the entire vest.

Figure 2: Vest

The final design is a purchased neoprene vest with 48 motors on neoprene pads attached with Velcro. The vest has a 50-wire ribbon cable and a ground cable that connect to the electronics box. Solder was used to connect the ends of the wires to the ribbon cable at the back of the vest. Connections were insulated and fabric glue was used to attach double fold bias tape to cover the wires. All ground wires were connected together to a long ground wire.

Both the ribbon cable and the ground cable were attached to an electronics box. This box (see Figure 3) contained a specialized circuit board (see Figure 4) with 48 transistors and two LED driver boards attached. Also in the box were a Propeller board, a ribbon cable plug, a USB plug, and two power plug adapters.

Figure 3:  Electronics Box

Figure 4: Circuit Diagram

Figure 5 shows the flow of information from the depth sensor to the vest. The C# Master Controller Program (MCP) collects data from the depth sensor (the 3D video camera) and converts it to low-resolution 3D video. That frame data is sent to the C# Communicator Program, which passes it to the Propeller microprocessor through the USB cable. The Parallax Propeller program takes the low-resolution 3D video and sends it to the LED drivers in the circuit box....
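A minimal sketch of the Communicator-to-Propeller hop, written in Python rather than the project's C#: it assumes a simple one-byte-per-motor frame with a header byte, which is my illustrative layout, not the project's documented wire protocol.

```python
ROWS, COLS = 6, 8  # 48-motor grid; the 6x8 layout is an assumption


def pack_frame(levels):
    """Pack 48 per-motor intensity levels (0-7) into one byte frame
    for the USB-serial link. The 0xFF header byte marks the start of
    a frame so the microcontroller can resynchronize after errors.
    """
    assert len(levels) == ROWS * COLS
    assert all(0 <= v <= 7 for v in levels)
    return bytes([0xFF]) + bytes(levels)


# Hypothetical usage with pyserial (port name and baud are assumptions):
# import serial
# port = serial.Serial("/dev/ttyUSB0", 115200)
# port.write(pack_frame(frame_levels))
```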


  • 2 × Adafruit 12-Bit 24-Channel LED Drivers
  • 1 × Raspberry Pi
  • 1 × Parallax Propeller Microcontroller
  • 1 × Microphone
  • 1 × Laptop running Windows
  • 48 × PNP Transistors
  • 48 × Enclosed Vibration Motors
  • 1 × Neoprene Vest
  • 1 × Microsoft Kinect
  • 1 × 50-Wire Ribbon Cable


Project logs
  • How to use LED drivers to run vibration motors

    09/29/2014 at 01:20 0 comments

    The code to run the motors from the Parallax Propeller is here. This code is for the Adafruit 24-Channel 12-bit PWM LED Driver - SPI Interface (TLC5947). The following is an explanation of how it works. To run the LED drivers, first send a 0 signal to the clock and the latch. Then send out 12 bits of information to the driver, shifting each bit in with a pulse to the clock. Repeat that process 24 times (once for each LED or motor). When all the data is in, the program sends a pulse to the latch, which starts the PWM. The program then repeats this process continuously.
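    The shifting sequence above can be sketched in Python (the project's actual code is in SPIN for the Propeller; the channel ordering and the pin names in the commented loop are assumptions to check against the datasheet):

```python
def tlc5947_bitstream(values):
    """Build the bit sequence shifted into a TLC5947: 24 channels of
    12 bits each, MSB first, with the highest-numbered channel shifted
    in first (288 bits total). Channel ordering is an assumption here.
    """
    assert len(values) == 24
    bits = []
    for v in reversed(values):          # last channel goes in first
        for i in range(11, -1, -1):     # MSB-first within each channel
            bits.append((v >> i) & 1)
    return bits


# Hypothetical bit-bang loop (RPi.GPIO-style calls, pins assumed):
# GPIO.output(LATCH, 0); GPIO.output(CLOCK, 0)      # start low
# for bit in tlc5947_bitstream(duty_cycles):
#     GPIO.output(DATA, bit)
#     GPIO.output(CLOCK, 1); GPIO.output(CLOCK, 0)  # clock in one bit
# GPIO.output(LATCH, 1); GPIO.output(LATCH, 0)      # latch: PWM starts
```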

    Note: There is a big difference for shifting in information between the Adafruit 24-Channel 12-bit PWM LED Driver - SPI Interface (TLC5947) which I got to work and the Adafruit 12-Channel 16-bit PWM LED Driver - SPI Interface (TLC59711) which did not work for me.

    The difference is that the TLC5947 has a latch input that the microcontroller can drive directly, unlike the TLC59711. The TLC59711's latch is controlled by sending a specific code and then shifting in data for LED groups and individual PWM values for the 12 output pins. The speed required to send this information exceeds what Python can manage on the Raspberry Pi at 700 MHz (the default clock speed). I believe it would have worked properly in C or C++ on the Raspberry Pi, or in SPIN on the Parallax Propeller.

  • Fast Fourier Transform (FFT) on the Raspberry Pi

    09/29/2014 at 00:38 0 comments

    To run FFT on the Raspberry Pi I downloaded and modified the FFT Program, which allowed me to run real-time FFT. That code specifies 2^11 as the minimum chunk (a piece of audio) size at the standard 44,100 Hz sample rate. Originally I used 2^13 for the real-time FFT because it worked on the Pi. To speed up the process so that the motors would react almost instantaneously to sound, I lowered the chunk size to 2^8 with a sample rate of 14,400 Hz. To lower the audio sampling rate for playing Call of Duty, I used a USB microphone instead of a male-to-male adapter in the microphone slot of the USB sound card. The other male-to-male adapter connects to the audio output of the Xbox.

    Lowering the sample rate to 14,400 Hz and the chunk size to 2^8 lets the Pi keep up with the FFT computations, which in turn increases the update rate from 5 times to 56 times each second (14,400 / 256 ≈ 56). I was shocked that I was able to get this far with the processing power of the Raspberry Pi. In the process of making the program faster, I send out 4 control bytes and receive a 2-character return for each FFT iteration to control the vest. This program takes up about 60% of the computational power of the Raspberry Pi (excluding the GPU) while running the Python program with C libraries. I was also shocked when I found that out. In the future I will make a better pattern-recognition program and will hopefully remove the Parallax Propeller from the equation so that one main microcontroller runs the whole process. I also wrote code for running the vest at different frequencies so that anyone can adapt it to different games by changing the settings a little. The Raspberry Pi required the extra libraries PyAudio, PySerial, and NumPy to run the code. This gamer vest is so fun!
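    A minimal sketch of the per-chunk FFT at these settings, using NumPy (the 200 Hz bass cutoff is an illustrative choice for catching gunshot and explosion rumble, not the project's exact threshold):

```python
import numpy as np

RATE = 14_400   # Hz, the lowered sample rate from the log
CHUNK = 2 ** 8  # 256 samples per chunk -> about 56 FFTs per second


def bass_energy(samples, cutoff_hz=200.0):
    """Return the summed FFT magnitude below cutoff_hz for one chunk.

    A threshold on this value could decide when to fire the motors;
    the cutoff frequency here is an assumption, not the project's.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / RATE)
    return spectrum[freqs < cutoff_hz].sum()
```

    In a real loop, each 256-sample chunk would come from PyAudio and a bass-energy spike would trigger the serial message to the Propeller.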

  • Gamer Vest - Feel like you are in the game!

    09/28/2014 at 23:12 0 comments

    I researched the Fast Fourier Transform (FFT) on the Raspberry Pi and decided to implement it in this vest for gamers. I tested the vest with Call of Duty. The wearer can now feel the recoil when they shoot their gun, and other bass sounds like heart thumps from low health, enemies' loud shots, grenades, and rocket launchers. It's really fun! The vest can be used in other games and also for watching movies. It would be fantastic if the motor array were embedded into the back of movie theater seats to add another dimension to the movie-viewing experience, kind of like the Mickey's PhilharMagic® show at Disney World, where you get to smell a scent or feel something run across the room by having your ankles tickled with air.

    FFT enables the Pi to make spectrograms of the audio from any sound. A pattern-recognition program determines when there are audio cues. Then a signal is sent through a USB cable to the Parallax Propeller, which sends information to the Adafruit 24-channel PWM LED drivers, which turn on the motors in the vest.

View all 12 project logs

Build instructions
  • 1

    Procedure for Vest

    Purchase a Neoprene Vest.

    Put neoprene squares on 1" pieces of foam.

    Mount vibration motors on Neoprene platforms.

    Mount neoprene platforms onto the vest to be 2" apart in a grid pattern so that the vibrations can be recognized one from another.

    Solder the ends of the wires from the motors to the ribbon cable at the back of the vest. (I used 30-gauge wire; the high gauge prevented vibrations from spreading through the wires, and it worked. Still, DON'T use 30 gauge: use a somewhat lower gauge, but not too low.)

    Insulate the connections.

    Attach double fold bias tape to the vest with fabric glue to cover the wires.

    Connect all ground wires to a long ground wire.

    Procedure for Electronics Box

    To control 48 motors at once (THIS IS AWESOME) follow the diagram.

    This uses 2 Adafruit LED drivers, 48 PNP transistors (rated for at least 60 milliamps; mine could handle 800 milliamps), a box, 1 Parallax Propeller, and D-cell batteries (which hold about 12 amp-hours of charge).
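    The battery-life arithmetic behind those numbers, as a quick worst-case sketch (all motors at full speed continuously, which real use never reaches):

```python
# Worst-case current draw and battery life, using figures from the build notes.
motors = 48
current_per_motor_a = 0.060   # 60 mA per motor at full speed
battery_capacity_ah = 12      # approximate D-cell pack capacity

total_current_a = motors * current_per_motor_a         # 2.88 A worst case
runtime_hours = battery_capacity_ah / total_current_a  # roughly 4 hours
```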



Rishi Gaurav Bhatnagar wrote 03/03/2015 at 09:58 point

I have been trying to install Kinect on a Pi, but haven't been able to do it using the xxcorde documentation on GitHub. How have you done it? Could you point me to a blog or something else I could use for this?


Sean Benson wrote 03/03/2015 at 14:04 point

The Kinect works with the project for the visually impaired on a laptop. The Pi runs an FFT program to interpret the sound from the video game. So they are two separate things that do not work together.


peter jansen wrote 09/21/2014 at 18:08 point


David Cook wrote 07/21/2014 at 00:59 point
Cool demo video!


Sean Benson wrote 07/21/2014 at 04:00 point
Thank you!

