Pathfinder - Haptic Navigation

Wearable Navigation Assistance for the Blind

Pathfinder is a wearable device that translates distance into haptic feedback. Users simply wear the wristband (or glove), point at objects up to 500 centimeters away, and feel gentle pulses at their fingertips corresponding to the object's distance. It's designed to give the user greater freedom of motion and longer operational range than traditional navigation aids for the blind, such as the cane. I incorporated research ranging from embedded electronics to the neuroscience of touch to turn this simple concept into the best prototype I could, before sharing the device with my local community center for the blind.

Author's Note: This writeup pertains to Pathfinder's "Milestone 1," which is my first really usable prototype that somewhat resembles a consumer device. I'm currently working on Milestone 2, which adds smarter haptic feedback and moves to a fully custom, integrated design. That's what future project logs will be documenting.

Step 1: Getting Distance Data

I began by sourcing the cheapest ultrasonic sensor I could find: the $2 HC-SR04. It incorporates a transmitter and receiver operating on 40 kHz sound waves, along with their drive and timing circuitry, into a single module. I then interfaced this sensor with an ATmega328P microcontroller, at first through the Arduino platform to make development as simple as possible. In testing, I measured consistent 0.5cm precision out to a maximum range of 500cm, with a 30° cone of detection. However, accuracy remained off by around 5%. This turned out to be the impact of ambient temperature on the speed of sound, so I integrated data from a TMP36 analog temperature sensor to achieve accuracy within 1% of actual values.
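For the curious, the compensation math boils down to just a couple of lines. Here's a plain C++ sketch (function names are mine for this example, not from my actual firmware):

```cpp
#include <cmath>

// Speed of sound in air vs. temperature, using the common linear
// approximation v ≈ 331.3 + 0.606 * T (m/s, T in °C). Between 0°C
// and 30°C this varies by roughly 5%, which matches the error I saw.
double speedOfSound(double tempC) {
    return 331.3 + 0.606 * tempC;
}

// The HC-SR04 reports round-trip echo time in microseconds, so the
// one-way distance is (echo time * speed of sound) / 2.
double echoToDistanceCm(double echoUs, double tempC) {
    double cmPerUs = speedOfSound(tempC) * 100.0 / 1e6; // m/s -> cm/us
    return echoUs * cmPerUs / 2.0;
}
```

At 20°C, a 1000 us echo works out to about 17.2 cm; skip the temperature term and your readings drift by a few percent between a cold morning and a warm room.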

Haptic Feedback

I used small, circular motors built for haptic feedback in mobile devices. A motor is placed on the pinky fingertip of the user's non-dominant hand, allowing for good tactile sensitivity while remaining minimally intrusive. To convey intensity, I manipulate the frequency of gentle pulses, whereas most traditional systems vary vibration strength. This is a really delicate process: having the best sensors and data in the world won't help unless I can actually convey that data to the user.

My main challenge so far has been resolution. It's fairly easy to tell apart, say, the different quartiles of the system's range, but we need to do better. First, I asked a focus group of future beta testers to identify the ideal range for an assistive navigation device. Most agreed that 250cm, or just over 8 feet, would be a useful extension over their existing options, and that they would see diminishing returns in utility past that point.

For my first prototypes, I used a simple linear scale that changed the delay between haptic pulses. What is a haptic pulse, you ask? Even simpler: a uC pin goes HIGH for roughly 20 milliseconds, driving the gate of an N-channel MOSFET to switch a simple ERM motor. I have big plans for this cobbled-together assembly, however, and I'm now looking at LRA motors plus specialized haptic driver ICs.
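That first linear scale really is as simple as it sounds. A sketch in plain C++ (the constants are approximate: the 50-1000ms usable delay window comes from my own experimentation, and the names are made up for this example):

```cpp
#include <algorithm>

// Approximate MS1 constants: each pulse is ~20ms of motor-ON time,
// and the usable gap between pulses runs from 50ms (very close)
// to 1000ms (far away).
const int PULSE_ON_MS  = 20;
const int MIN_DELAY_MS = 50;
const int MAX_DELAY_MS = 1000;
const int MAX_RANGE_CM = 250; // the focus group's preferred range

// Simple linear scale: the delay between pulses grows linearly with
// measured distance, clamped to the usable feedback window.
int linearPulseDelay(int distanceCm) {
    int d = std::clamp(distanceCm, 0, MAX_RANGE_CM);
    return MIN_DELAY_MS + (MAX_DELAY_MS - MIN_DELAY_MS) * d / MAX_RANGE_CM;
}
```

The main loop then just pulses the MOSFET gate for PULSE_ON_MS, waits linearPulseDelay(distance), and repeats.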

Bringing It Together

I now had to integrate the system into a single device. I switched to a much smaller controller board and arranged the parts on a prototyping grid, hand-soldering the connections with jumper wires. A 9V battery provided a primitive yet portable power source. This early prototype measured 50x70mm and weighed 115 grams. The assembly was attached to a utility glove with velcro and tested; users reported the obvious: the device was bulky, had poor weight balance, and the loose glove made for poor tactile feedback.

From Prototype to Product

With my basic design vision physically realized, I became significantly more ambitious. Over a two-week development sprint, I learned EAGLE CAD and used it to create PCB layouts for my prototype, which enabled me to add features such as an accelerometer/gyroscope, support for lithium-polymer battery packs, and an additional wrist motor for complementary feedback patterns, all while significantly reducing the footprint. I manufactured the 35x55mm board with resources from my school's chemistry lab (cupric chloride etchant, plus the essential fume hood and personal protective equipment!). The glove, too, was replaced with a more elastic variant, and the motor was sewn into the fingertip to preserve tactile sensitivity. Overall, the board was 45% smaller and weighed 60% less (45g), while improving functionality.

Iterative Improvement

Mark III has a slight board shrink (30x45mm, 40g) and systems upgrade. The 8-bit 16MHz AVR processor was swapped for a 32-bit 96MHz...


  • 1 × ATmega328P-AU 8-bit AVR microcontroller, same as Arduino Uno et al. This one is the SMD TQFP-32 variant.
  • 1 × InvenSense MPU-6050 The accel/gyro combo chip, complete with a "DMP" that claims to do sensor filtering for us. Jeff Rowberg has an incredibly valuable library over at I2Cdevlib
  • 1 × TI DRV2603 Haptic Driver Allows drive of ERM and LRA motors. Incorporates specialized drive patterns that supposedly allow finer feedback control than power transistors... we'll see.
  • 1 × MCP73871 LiPo Battery Charger (Microchip) Standard LiPo charging IC for onboard battery management.
  • 1 × MCP1700 Very small form factor LDO from Microchip. Low cost, low power, and well supplied.

View all 9 components

  • September Snapshot

    Neil, 09/21/2015 at 05:09

    The road so far:

    We've got a very stable HW/SW prototype! (Check the GitHub repo for the latest files.)

    Pathfinder has been validated in a bunch of different situations (outdoors, public environments, around complex furniture, etc.)

    The board can be built for under $25 in small volumes, so we've also achieved our accessibility goals.

    (A lot of improvements are under the hood; it takes quite a bit to reach a "milestone"!)

    So what's next?

    Haptics has become a rather active space in the last few months - Apple's 3D/Force Touch has attracted a lot of attention, and shows what I believe to be the future of haptics. I've been playing with (and dissecting) their hardware, and I've got a whole lot of hints to incorporate into Pathfinder. Essentially, we're going to work with more advanced haptic actuators, and add "sub-pulses" into each traditional pulse, adding a lot of nuance and detail to the feedback that's also more intuitive - meaning faster training and a better UX.

    Gesture support is coming soon! I've reworked the accelerometer handling code, so we can now detect patterns over time and respond accordingly.

    Getting a proper enclosure to work with will better define the ways we can interact with the user - from ergonomics to feedback. Look out for that soon!

  • Quick HW Update

    Neil, 06/29/2015 at 07:27

    I haven't done a deep dive into the current hardware revision, but now seems a good time to update the BOM in the spirit of THP.

    This is the current HW schematic (DipTrace, by the way), and I'll briefly point out the parts of particular interest to HaD and the sponsors:

    1. ATmega328P: Of course, we have a very popular AVR running the show. The TQFP package is remarkably easy to work with at the home lab level, and of course the chips themselves take a lot of abuse when prototyping. I've deployed dozens of these across the various prototypes Pathfinder has been through, and it isn't leaving anytime soon. (Low power, simple layout, easy SW dev.) When we get closer to production, however, I'll consider moving to an ARM chip (SAM series, maybe?) that might offer more HW amenities at the price of increased complexity.
    2. MCP73871: The venerable LiPo charger IC found in so many designs. Very easy to work with compared to some of the very-fine-pitch DFNs other ICs come in, and the power rating (500 mA) is great for a wearable system. Also the best price/performance I could find in reliable stock, so it was an easy choice.
    3. MCP1700: A simple 3.3V LDO, chosen for its low cost, tiny package, and clean output. It feeds the IMU on a private voltage rail, providing a layer of much needed immunity from the noisy haptic motors.
    4. TI DRV2603: Unfortunately, this chip has not been worked into the schematics yet, but will be included on the next stepping. It's a little complex, and it took me a while to get the samples working on a little testbed I built. Now, I'm comfortable enough to bring the IC into the main design, where it will replace a primitive discrete H-bridge that has given me a bit of a headache to date. I picked this chip to seamlessly drive LRA motors, which offer much improved haptic feedback experiences over ERMs. Being able to outsource the complex frequency matching and drive patterns to this chip allows me to focus on the user experience vs. implementation semantics. I'm glad to work with a prevalidated solution, and can't wait to get the drivers onboard. Should be a few weeks or so, but I already have a proven (if on a testbed) layout, so it should be relatively easy.

    That's all for now, but I'll update the project description soon to better reflect all the progress we've made over the past year. Thanks!

  • G2 Validated / SW Milestone

    Neil, 06/29/2015 at 06:34

    Hey everyone,

    Major progress has been made with the G2 revision; I've been busy all week writing software and experimenting with the boards. Here's what's new:

    First off, I've cleaned out my GitHub repo to make my workspace public and (hopefully) easily accessible. Take a look at the codebase, which is where most of the improvements have been made. (The link should appear here; not sure why it doesn't show up on the project page!)

    Most significantly, the IMU (MPU-6050) is working far better than expected; thanks to Jeff Rowberg's libraries, I was able to get the DMP working great in my implementation. Thus, we have excellent orientation tracking of the user's hand, parsed at 200+ Hz thanks to the MPU's dedicated data filtering logic.

    So far, this is being used just to put Pathfinder into a low-power sleep when it is brought and held below a threshold angle (a 25-degree angle of elevation from the ground). I'll keep thinking about other cases to detect and respond to, as well as gesture controls to implement. Please weigh in with any ideas/suggestions!
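For anyone curious about the math: turning the DMP's quaternion into a pointing angle amounts to rotating the device's pointing axis into the world frame and reading off its vertical component. A plain C++ sketch (treating body X as the pointing axis is my assumption, not something fixed by the hardware):

```cpp
#include <algorithm>
#include <cmath>

// Given an orientation quaternion (w, x, y, z), the world-frame
// vertical component of the body X axis (our assumed pointing
// direction) is the rotation-matrix entry 2*(x*z - w*y). Taking
// asin of that gives the elevation angle above the horizon.
double elevationFromQuat(double w, double x, double y, double z) {
    double vertical = 2.0 * (x * z - w * y);
    vertical = std::max(-1.0, std::min(1.0, vertical)); // guard asin domain
    return std::asin(vertical) * 180.0 / std::acos(-1.0);
}
```

Compare that result against the threshold and you have the whole sleep trigger.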

    For other HW developers out there, the MPU-6050 has my recommendation as a solid starter IMU platform. The DMP and the libraries available for it make software development a breeze, and the HW side is simple enough: I got the chip working on a basic 2-layer PCB on the first spin I tested.

    Moving forward with software, I'm working on creating a more stable, mature platform that can poll all of Pathfinder's systems efficiently and cut power consumption.

    Finally, here's a quick picture from my phone (no SLR, unfortunately):

    Next time, we talk about HW rev G3!

  • Checking back in!

    Neil, 06/04/2015 at 09:45

    Hey everyone, it's been quite a while since the last update! Development was intermittent over much of the past 9 months, but all those steps have added up to a new milestone revision, which has finally pulled all the parts together.
    Pictures coming soon, but here are the major changes:

    1. Added H-bridge drive scheme for the motor, allowing more precise control over haptic patterns (active braking, for example)
    2. Integrated LiPo charging onto the board, then took it off temporarily while I work out the more critical hardware
    3. Significant improvements to EMC and general grounding/signal integrity, thanks to greater accumulated knowledge. Not entirely necessary, but good to have!
    4. Added a micro-USB port for serial debug and charging (the UART TX/RX lines actually run directly on the USB data lines to a downstream FTDI board; basically, I just like the micro-USB connector interface).
    5. Shrinks all around. Total dimensions are significantly reduced, making it cheaper to manufacture and hopefully easier to integrate, eventually.
    6. Full code cleanup, hopefully more readable than before. This isn't on GitHub yet; my sources are all kind of scattered and I need to redo my whole VCS setup. Everything, from HW to SW, will be up soon.

    And onto the major, short term to-do list:

    1. Still gotta get that accelerometer online and verified. I've played with the MPU-6050 before, and have the basic code + some preliminary functions ready to go in the source, but haven't deployed yet since every board has a little defect that prevents use of the accel. I just lost another board to an unrelated power issue, so the new fixed boards should come in sometime this week, at which point we'll re-examine the IMU state of affairs. Certainly, it is the most complicated SW block in the project yet.
    2. Rebuild the full MVP and re-test usability characteristics, from user experience to battery life.
    3. Consider rolling my own ultrasonic sensor to enable a better form factor.
    4. And finally, start considering "wearable" integration more seriously. We have at least progressed to a less obstructive wristband, but actual ship-ready products can't have PCBs or components exposed, so I need to look into an enclosure or some other creative deployment scheme. Still a ways off though.

    I'll also fix the lack of pictures soon. This is just, as the title states, a check in to confirm that Pathfinder is moving forward, but we're slowing down a bit as we move out of the barebones prototype phase. Additionally, I'll have much more time over these next few summer months, so progress is coming soon!


  • Wireless Charging

    Neil, 08/21/2014 at 06:49

    Daily usability is a big concern when designing a device for a disabled population. I take a lot of things for granted, like being able to just plug my devices in when I need to or even hotswap batteries. The reality is, however, that these mundane tasks add a lot of overhead for the blind.

    So I need to minimize maintenance of the device. 

    Wireless charging is perhaps the easiest way to minimize user requirements, so I dropped a little micro-USB header on the board and am plugging it into a generic Qi receiver for cell phones. Will update soon with pictures, charging performance, and a schematic!

  • Quick Update - Project Video!

    Neil, 08/21/2014 at 06:45

    Here's the video going over Milestone 1 of the project!

  • Using IMU (accel/gyro) Data

    Neil, 08/21/2014 at 06:43

    I really like the MPU-6050. It's a small, 4x4mm QFN chip that has an accelerometer, gyroscope, and a legendary "DMP" that we'll get to in a minute.

    But first, why do we need spatial data?

    At first, my goal was to give the wearable device a bit of contextual information: the user is probably not using the device 24/7, and asking them to turn it off (which currently amounts to pulling out the battery :) ) isn't feasible. So the device needs to be able to sleep at times when reporting distance is useless (if the device is sitting still, potentially not even worn) or annoying (vibration when the device is pointed at the floor while walking).

    Now, Milestone 1 (henceforth MS1? Please?) has a "dumb" spatial response: if the device is unmoved for >2 minutes, OR if it is pointed more than 60 degrees below the horizon, it enters sleep mode.
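In code, the whole "dumb" rule fits in a few lines. A plain C++ sketch (the names and the choice of body X as the pointing axis are mine for this example):

```cpp
#include <cmath>

// Elevation of the pointing axis above the horizon, estimated from
// raw accelerometer data while the device is roughly still, so the
// sensor mostly measures gravity. X = pointing direction is assumed.
double elevationFromAccel(double ax, double ay, double az) {
    return std::atan2(ax, std::sqrt(ay * ay + az * az))
           * 180.0 / std::acos(-1.0);
}

// MS1's blanket sleep rule: unmoved for more than 2 minutes, OR
// pointed more than 60 degrees below the horizon.
bool shouldSleep(double elevationDeg, double stillSeconds) {
    return stillSeconds > 120.0 || elevationDeg < -60.0;
}
```

The gravity-only estimate is exactly why this rule is "dumb": during a walking swing the accelerometer sees a lot more than gravity, which is where the gesture detection discussed below comes in.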

    But there are potentially some use cases where these situations are a part of normal operation: pointing at steps, for example, would not be possible with this blanket rule. 

    I'd like to detect discrete gesture patterns, and define device behavior that way. For example, detecting a natural walking gait, and sleeping only during the bottom sweep of the swing. 

    This would also be rather important in prolonging battery life, as the motor is by far the most significant power consumer. 

    In any case, that's where I'm headed, but I'll need to wrap my head around those beautifully complex quaternions returned by Jeff Rowberg's MPU-6050 library first.

    Also, sourcing expensive sensors from China often ends poorly :(. My current boards do not work with the raw MPU-6050 chips coming from China; they return wildly unstable values but otherwise expose the right data structures and I2C commands. It remains to be seen whether this is a fault with the ICs, or the layout itself (lots of passives to support the chip!).

  • On Haptic Feedback

    Neil, 08/21/2014 at 06:10

    At this point, "Haptic Feedback" seems more an art than a science.

    One of the central challenges in this project is the communication of sensor data to the user in an intuitive manner. Now, blind people are often described as having significantly enhanced tactile senses, which would make sense considering the precision and sensitivity that's needed to quickly parse Braille. So when I first brainstormed feedback mechanisms, haptic feedback seemed an obvious choice. 

    I started with this funny-looking prototype:

    Prototype (marked "Alpha") on top, current LRA iPhone motor on the bottom

    It was a DC brushed motor with a piece of rubber glued to the shaft. When the motor spun, the imbalanced weight would cause the whole frame ("frame" is a fancy word for a piece of PVC pipe, taped to the motor) to vibrate. 

    The very first breadboard example just varied the voltage available to the motor (through PWM). I immediately ruled that out as an unusable option: It was near impossible to feel subtle changes in vibration intensity; in fact, I had an easier time listening to the motor's hum to discern changes.

    I started working on a better feedback system right away. Not too many hours later, I had a new version set up that pulsed the motor regularly, and by varying the time between discrete pulses, I had a much more workable sensory range. In fact, I was so satisfied with this new pulse-based model that I hardly changed it at all while the rest of the design evolved significantly.

    Switching to a small, disk ERM motor, not unlike the LRA version pictured above, allowed for much more refinement in each pulse, as the small motor was faster to spin up and slow down. 

    But the software was still limited to a simple linear relationship between pulseDelay (the time between discrete pulses, which are 20ms blocks of motor-ON time) and distanceToObject. This setup provided a solid 10cm resolution over the 250cm target range, but

    1) that's not good enough (it never is!) and

    2) 10cm is waaaay too large at ranges <50cm, and not nearly as important at ranges >150cm

    So what we really need is a curve of some sort. I tried my hand at modeling something in Mathematica:

    Old linear function is a line, New curve function is a curve

    But most people who tried it felt little, if any, difference. 

    That's when I realized what I really wanted: a logarithmic function. 

    Humans don't feel numbers - what I mean by that is, we're not sensitive to numerical changes in the pulseDelay. Although we maybe can feel the difference between 1000ms and 500ms delays, we certainly don't think in terms of the time between the pulses: we're more inclined to say things like, "oh, the frequency just doubled." Or at least that's what makes sense in my head.

    With that in mind, I believe an ideal "quantitative haptic feedback" system should be focused on a base 2 logarithmic scale, whereby a user can easily point to different points on the range between which the pulse frequency roughly doubles. 

    This system has the fundamental feature of redistributing our finite feedback range (by the way, "feedback range" refers to the fact that the motor can only pulse so quickly and slowly without becoming annoying or insufficiently informative. In my experience, this range is from 50ms to ~1000ms.). Now, the pulse frequency will double over fairly small distance changes when in close proximity, while the same doubling points will be spaced much further apart at longer ranges.
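A minimal sketch of what I mean (my interpretation, not shipped firmware; the constants are the 50ms-1000ms window mentioned above): keep the delay proportional to distance, and the pulse frequency doubles exactly when the distance halves, so the "doubling points" naturally crowd together up close and spread apart at range.

```cpp
#include <algorithm>

const double MIN_DELAY_MS = 50.0;
const double MAX_DELAY_MS = 1000.0;
const double MAX_RANGE_CM = 250.0;

// Delay proportional to distance: halving the distance doubles the
// pulse frequency, giving equal perceptual (log2) steps per distance
// ratio. The clamp keeps us inside the usable feedback window, which
// spans log2(1000/50) ≈ 4.3 frequency doublings.
double logPulseDelay(double distanceCm) {
    double delay = MAX_DELAY_MS * distanceCm / MAX_RANGE_CM;
    return std::clamp(delay, MIN_DELAY_MS, MAX_DELAY_MS);
}
```

Note the contrast with the old linear formula: that one added a fixed 50ms offset, which quietly breaks the doubling relationship at close range.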

    So that's the game plan to bring our haptic feedback to the next level. I'm also looking into haptic driver ICs that allow us to:

    1) use cool LRAs that are supposedly superior to ERMs in providing haptic feedback

    2) potentially output cleaner, more controlled pulses than a simple square wave through a transistor.

    More on the new IC soon! Unfortunately, they cost >$2 in small volumes, a big step over generic transistors...



oshpark wrote 08/23/2016 at 18:15

Awesome project! Any updates for 2016?


Winston wrote 08/23/2016 at 14:00

What do you think about installing one of these as a head mounted unit rather than on the wrist allowing range detection using just the normal head movement to test a particular direction? 


praetorian wrote 01/18/2016 at 04:02

Get out of my head! Haha, it's so weird but I was thinking of doing this exact same thing several years ago. Bought all the parts (bicycle mesh gloves, miniature vibrating pancake motors, ultrasonic sensors) but nothing eventuated of it. Now I come across your project and I'm like, wow, that's everything I need to get started again. Even though the need for mine is no longer there (I had a friend I was going to gift it to, but sadly he passed) it would still be a cool project and if I keep it with me, I may find someone who I can give it to in need.

Anyway, not trying to steal your thunder or anything but just find it amazing that such similar ideas can occur elsewhere. Great project and I'll be looking forward to stealing code etc for inspiration! Suppose I at least owe you a skull for that :D


arun wrote 07/11/2015 at 13:52

neil can u give me full documentation of the project as well as send me an email on how to do this project


Jasmine Brackett wrote 07/13/2015 at 22:49

Hello Arun, there is a lot of info on this page, and also a link to the git repo. This project is not set up as a tutorial (not all projects here are). Probably best to set up another project and start tinkering on your own. If you run into specific problems, people around here will be ready to help if you ask nicely. Good luck!


Neil wrote 08/14/2014 at 03:52
Hey Adam and Dave,
Thanks for the comments! I'll get the update stream up soon, I promise :). But to answer the immediate questions:

The sensor is supposedly good to 10m (!) but I've personally validated it to ~600cm. After that, two things happen - first, we lose centimeter level accuracy, and second, the cone of detection just gets too wide to be useful, i.e. extraneous objects, well off to the sides, are detected. So I cap the maximum range in software to just 500cm, which I think is pretty good range for daily navigation. On the other hand, mapping that massive distance range to haptics turns out to be really tough, and there's a lot I have to say on that in the very near future.

The accelerometer (and gyro, I'm using the MPU-6050) is really important to the device's practicality. For one, the use case that initially earned it a spot on the board is orientation detection; when the device is held down to the floor for, say, >5 seconds, the motor should probably stop bothering the user with reports of what is ostensibly the floor! Similarly, maybe a minute of little to no motion could put the device to sleep. Pretty simple, but when I get more comfortable with the IMU, I might even be able to do things like sense the user's gait and disable the device as the hand passes a critical angle, but still detect when the natural walk pattern is broken and thus allow downward sensing when explicitly desired. Why is it so important? Battery life: aggressive sleep is vital to be able to drive a motor all day! Furthermore, buttons would be out of the question for this device, so maybe hand gestures could be used for a very primitive UI. I'm still finalizing the hardware for now, so unfortunately I haven't been able to take a deep dive into the MPU-6050 and all its craziness.

Again, I'll be sure to address all these topics in a more formal writeup. I actually have been working on this on and off for the past year or so, but I'm only just getting settled on Hackaday Projects :). Thanks again for the comments!


davedarko wrote 08/29/2014 at 23:05
Loved the end of the project video where he was happy to find the chair :)


Adam Fabio wrote 08/14/2014 at 03:31
Awesome project Neil! How much resolution do you get with the sensors and haptic feedback? I.E. can the wearer determine if an item is 20 cm or 2m away?


davedarko wrote 08/13/2014 at 22:34
Hey, cool idea! what are you using the accelerometer for? Seems like an interesting experiment to blind fold someone and let him use only those gloves to go around. Any reasons you took a teensy and not something "smaller"? anyway, nice project.

