
Digital White Cane

The Digital White Cane helps prevent frequent head injuries and provides intuitive navigation assistance for the blind and visually impaired.

The Digital White Cane empowers people who are blind or visually impaired to live a more independent lifestyle. Using a discreet array of distance sensors arranged on an elastic band and worn across the forehead, it can detect objects within 2m (6ft). These measurements are converted to haptic feedback.

With minimal training, a blind person can develop a very natural sense of their surroundings. Using the Digital White Cane, they literally feel the space around them. Walking in an unfamiliar place is now dramatically more comfortable and enjoyable. With a new found confidence, getting from Point A to Point B is a safe and calm experience.

Problem: People who are blind or visually impaired suffer a large number of mobility-related head injuries.

Source: https://pdfs.semanticscholar.org/609b/3d62282eaa5c2e4a89c803c0b401a789e27d.pdf

Other Methods 

Physical canes and sonar devices are useful, but provide limited information: both measure only one point at a time.

A camera captures many points and provides a lot of information, but without stereo vision those points carry no depth. With a single camera, a pillar directly in front of you would be indistinguishable from a distant skyscraper.

Our Device

The Digital White Cane delivers immediate depth data from multiple directions simultaneously. This wasn't practical before compact time-of-flight (ToF) LIDAR technology; in the past, LIDAR was as big as a truck and very expensive. We've created a lightweight and effective solution.

Cost

As 87% of the world’s blind live in developing countries (ref below), we endeavored to make a device that was inexpensive and easy to replicate.  When produced on a mass scale, our device is projected to cost less than $30 for materials and assembly.

R. Velázquez, "Wearable Assistive Devices for the Blind," Chapter 17 in A. Lay-Ekuakille & S.C. Mukhopadhyay (Eds.), Wearable and Autonomous Biomedical Devices and Systems for Smart Environment: Issues and Characterization, LNEE 75, Springer, pp. 331-349, 2010.

Our First Real Life Test with Mary

Big Changes for Our Latest Device


document - 5.74 kB - 09/04/2017 at 11:27


digital_white_cane_v3_5_sensor.pdf

Schematic for the current head mounted version

Adobe Portable Document Format - 72.16 kB - 09/04/2017 at 07:53


TOF.pdf

Schematic for the Time of Flight daughter boards

Adobe Portable Document Format - 20.41 kB - 09/04/2017 at 07:53


digital_white_cane_v3_5_sensor.zip

KiCad files for the current version

Zip Archive - 36.36 kB - 09/04/2017 at 07:53


digital_white_cane_v2.pdf

First version of the Motor Driver circuit

Adobe Portable Document Format - 91.70 kB - 09/04/2017 at 07:53



  • 10 × Distance sensor — the Sharp IR distance sensor
  • 2 × Teensy LC — an Arduino-compatible microcontroller board
  • 10 × Motor driver ICs — chips specifically designed to drive ERM (eccentric rotating mass) motors
  • 10 × Pager motors

  • Final Step

    George Albercook • 09/04/2017 at 13:55 • 0 comments

    We have the multiplexer working properly. In the test below, it addresses each individual motor in turn, making them twitch.

  • Head Mounted Version

    George Albercook • 09/04/2017 at 04:17 • 0 comments

    We made a lot of progress on a totally head mounted version. While these Sharp IR distance sensors were easy to work with and we had them on hand, there is better technology available.  

    In the past few years, technology to measure distances as short as 1 millimeter and as long as 2 meters via the time of flight (ToF) of light has become available in a tiny package.

    Considering that light travels the length of a standard piece of notebook paper in one cycle of a gigahertz clock, that is kind of amazing. This technology is a form of LIDAR (Light Detection and Ranging). Think radar with lasers.  

    The photo below shows five breakout boards from Adafruit, each with a VL53L0X ToF sensor made by ST.


    Five tiny prototyping boards from Schmartboard with the DRV2605L motor drivers.



    The motor driver boards attached to another Schmartboard holding an I2C multiplexer, which is, in turn, connected to a Teensy LC.

    Both parts ready to be connected together.



  • Goodbye, Electric Dreads!

    George Albercook • 09/04/2017 at 02:24 • 0 comments

    Clearly the prototype was awkward to put on and take off.  Luckily, there is nothing about the concept that requires it to keep the original configuration.  

    The very first prototype used ace bandages to hold the pager motors in place on the forearms. We chose the forearms because the skin is relatively thin and the nerve endings are numerous and close together. Plus, the area is not "used" for much else.

    We tried several configurations and types of fabric to make a comfortable sleeve-like configuration.

    Here we tried stitching them into an elastic material. 

    This turned out to be awkward to put on. When putting on a shirt, it's surprising how much it helps that the sleeves are connected to the rest of the shirt.  It's similar to fastening your own bracelet.  One arm is less usable.

    Time for a new approach. We're going to put the whole thing in a headband.

  • Working with Haptic Signals

    George Albercook • 09/04/2017 at 01:56 • 0 comments

    After the rush of the Hackathon, we wanted to take a closer look at the fancy motor driver chips, which are made by Texas Instruments. 

    You might ask: why not stick with the simple transistors that were working? The answer has to do with the physics of the ERM (eccentric rotating mass) vibrating motors. They have stiction.

    When they first start moving it takes more power to start them than to keep them going.  This makes it very difficult to make them vibrate even a little. They also take time to get up to speed so it is difficult to get sharp transitions between levels. Thankfully, the DRV2605L chip addresses both of these problems and more. 

    The "L" suffix at the end of the part number indicates that it has a built-in library of vibration patterns. The library adjusts the power going into the ERM over time, resulting in refined vibration patterns. For example, it can overdrive the ERM for a short time to force it up to speed faster.

    We are still having some problems getting the DRV2605L chips to work.  At one point, we put one on the scope and it was clearly acting like it was trying to drive a huge capacitance.


    Several of the driver boards also had problems with drawing too much current.  We'll have to go through them carefully. Not at all unreasonable, considering we were working off of very little sleep and the pins of these things are half a millimeter wide. The soldering had to be done under a microscope.

    More updates soon!

  • Real World Test #1

    George Albercook • 09/03/2017 at 23:57 • 0 comments

    One of the world's leading vision research institutions, the Kellogg Eye Center, happens to be right here in Ann Arbor, MI.

    So I called to see if I could drop in on the Blind and Low Vision Support Group meeting. I took our electric dreadlock prototype to see if anyone wanted to try it out. Mary was kind enough to give it a go.

    It took her only a few minutes with the device before she was willing to stand up and walk around without her traditional white cane. Very exciting.

    We are so grateful to the Kellogg Eye Center and Mary, in particular, for helping us out.

  • Showtime!

    George Albercook • 09/03/2017 at 23:45 • 0 comments

    Time to show it off. We were given only 3 minutes for our presentation, but during the transition, while computers, projectors, and microphones were being connected, nobody said that I couldn't walk around at the front of the room.

    While the rest of our amazing team gave the presentation, I continued to wander around demonstrating the device.

    The experience of wearing the device was rather striking.  I have seen other systems, for example, that converted images to sound, which required a lot of concentration.  

    Using the Digital White Cane, I was able to carry on a conversation with only occasional hesitations when I needed to really focus. While "feeling" the world through this system for the first time, I was trying to narrate my experience.  

    I think that since the system was on my head, some of the circuits in my brain that create a map of the world were able to work in a "native" way.  Once I "felt" my teammates standing against the back wall, even after turning away,  I didn't have to try to remember that they were there.  

    It "felt" like they were there just as if I had seen them and turned away. (It also felt like they would probably try to mess with me while I was blindfolded.)  In addition, I had no problem remembering where the table was once I had turned away from it. 

  • Day 2: Ann Arbor Health Hacks 2017

    George Albercook • 09/03/2017 at 18:13 • 0 comments

    Taking turns getting sleep, we made great progress overnight. By morning, the sensor array was reading distance values.


    In spite of all the work on the multiplexer and the fancy motor driver chips, we were not getting them to talk to the Teensy LC microcontroller. 

    This is always the most telling point for a team. Things were going great when progress was fast, but we were about to see how things would go when the different parts were not coming together and time was short.

    Everyone was tired.  We were only a couple hours from presentation time and getting no love from the motor driver chips. Time to change plans. 

    Looking through the parts that different team members had brought to the party, we decided to go old school and drive the motors with individual transistors. A quick schematic drawing and everyone jumped in cutting wires, soldering, putting on heat shrink tubing, and rewriting the code, while others put the finishing touches on the presentation.


    Sure it got nicknamed the "Electric Dreadlocks" but it was beautiful to us. It was our baby. 

  • Day 1: Ann Arbor Health Hacks 2017

    George Albercook • 09/03/2017 at 17:50 • 0 comments

    Having won third place in 2016 as Team Q for our Personalized PT Assistant, a personalized real-time motion-capture application to improve the quality of and adherence to physical therapy, it seemed like good fun to go back the second year.

    The event started with some keynote speakers, followed by a call for anyone with an idea to make a pitch.

    I suggested that we could make a device that would help the blind and visually impaired navigate their world with more confidence and safety. It got named the Digital White Cane pretty early in the weekend.

    I guess my pitch was good enough because several people joined the team and we didn't sleep much for the next 36 hours.

    The group varied in age, as well as life and professional experiences. The perfect team! 

    They included: Isaac, Anthony, David, Jacob, Nick, Donnie, George, Rohita, and Victor.


    We set up shop in an open space on the second floor and dove in.


    Apparently, my hand drawn schematic for the multiplexer and motor driver board was slightly clearer than mud. That color code seems obvious to me.


    It did not take long before the team was suggesting that I go get coffee, so that I was not in the way.

