
FeelBeam: Haptic Distance Navigator

Handheld LiDAR device with sliding-lever haptic feedback. Helps newly blind users build spatial confidence. Open-source; migrating from Raspberry Pi 5 to ESP32-S3.

Helps newly blind users build spatial confidence. Open-source, migrating from Raspberry Pi 5 to ESP32-S3.

Initial feedback from the visually impaired community:
1. High potential for the newly blind adapting to indoor navigation.
2. Limited value for long-term blind users with refined spatial skills.
3. Hands-on testing essential for validation.

How it works: the TF-Luna LiDAR (I2C) measures 0.2–8 m, a microcontroller processes the data, and a servo moves a lever proportionally to distance (greater extension for closer objects, over a working range up to 4 m). Power: LiPo + USB-C in the final design; the current prototype is wired and runs on a Raspberry Pi 5.

Demo video (80 s, audio described): https://youtu.be/Haf9EQ_t5JE. 3D renders & full details: https://feelbeam.app.

Next steps:
1. Migrate to ESP32-S3 (BLE, USB-C, battery).
2. Hands-on testing in Berlin (DBSV/ABSV).
3. 3D-print pilot units.

Files: feelbeam.py, article.pdf. License: CC-BY-SA 4.0. Contact: https://feelbeam.app/#contact, info@feelbeam.app. Published October 2025 as prior art for open assistive tech innovation.

Abstract

This paper presents a conceptual prototype of a portable electronic assistant designed to aid visually impaired individuals in perceiving their surroundings through tactile feedback. The device utilizes a Time-of-Flight (TOF) LiDAR sensor (TF-Luna) to measure distances to nearby objects and translates these measurements into mechanical haptic feedback via a servo motor. Unlike traditional vibration-based systems, this approach employs servo positioning to provide proportional tactile cues, simulating a variable pressure or "extension" indication of proximity. Built on a Raspberry Pi 5 (4GB) platform with I2C communication, the prototype offers a low-cost, customizable solution for obstacle detection. The hardware setup, software implementation (based on manufacturer demos), and potential applications are detailed, with a focus on real-time navigation assistance. Initial user feedback has refined the project's focus toward adaptation support for newly blind individuals. Project website: www.feelbeam.app. Figures illustrate key components and prototype demonstrations.

Introduction

Visually impaired individuals face challenges in detecting environmental obstacles, often relying on white canes or auditory cues. Electronic aids incorporating LiDAR sensors can enhance spatial awareness through haptic feedback. This prototype introduces a handheld "flashlight-like" device where the TF-Luna LiDAR scans ahead, and a servo motor adjusts its angle to deliver tactile feedback—e.g., greater extension for closer objects, pressing against the user's finger for intuitive proximity sensing.

Initial Community Feedback and Refined Focus: Preliminary feedback from a member of the visually impaired community has been instrumental in shaping the project's direction. It was noted that while experienced blind individuals often develop exceptional spatial awareness through hand exploration and other senses, the device may hold significant potential as an adaptation and training tool for those newly adjusting to vision loss. This insight has focused the mission on helping users build confidence and develop mental maps of their surroundings during the critical early stages of sight loss, when traditional mobility skills are still forming.

The system is implemented on a Raspberry Pi 5 (4GB) running Linux kernel 6.6.51 (Debian-based), with I2C enabled in /boot/firmware/config.txt via dtparam=i2c_arm=on. To ensure GPIO compatibility on the Pi 5, python3-rpi-lgpio was installed in place of python3-rpi.gpio. The code draws from Waveshare's TF-Luna demo, adapted for servo integration.
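The proportional mapping at the heart of the device (greater lever extension for closer objects) can be sketched as a small pure function. The working range and maximum servo angle below are illustrative assumptions, not values taken from the published feelbeam.py:

```python
MIN_CM, MAX_CM = 20, 400   # assumed working range: 0.2 m to 4 m

def distance_to_angle(dist_cm, max_angle=90):
    """Closer object -> larger angle -> greater lever extension."""
    d = min(max(dist_cm, MIN_CM), MAX_CM)        # clamp to working range
    fraction = (MAX_CM - d) / (MAX_CM - MIN_CM)  # 1.0 at 0.2 m, 0.0 at 4 m
    return fraction * max_angle
```

On each loop iteration the control script would read a distance sample from the sensor and command the servo to `distance_to_angle(dist_cm)`; a linear map keeps the tactile cue intuitive, though a non-linear curve could emphasize near-range resolution.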

Hardware Description

The prototype uses affordable, off-the-shelf components for portability.

1. TOF Sensor: TF-Luna LiDAR Module

The TF-Luna is a compact, single-point TOF LiDAR sensor from Benewake, ideal for short-range distance measurement. It operates in I2C mode for integration with the Raspberry Pi.

  • Specifications (from Waveshare wiki and datasheet):
    • Ranging Distance: 0.2m to 8m (at 90% reflectivity indoors; reduces to 2.5m at 10% reflectivity).
    • Accuracy: ±6cm (0.2m-3m), ±2% (3m-8m).
    • Resolution: 1cm.
    • Frame Rate: 1-250Hz (adjustable).
    • Ambient Light Immunity: Up to 70Klux.
    • Light Source: VCSEL at 850nm (Class 1 eye-safe).
    • Field of View: 2°.
    • Power Supply: 3.7V-5.2V, average current ≤70mA, peak 150mA.
    • Dimensions: 35mm × 21.25mm × 13.5mm, weight <5g.
    • Operating Temperature: -10°C to 60°C.
    • Communication: I2C (slave mode, default address 0x10, max rate 400kbps).
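Given the I2C interface above, a distance sample can be read and decoded in a few lines. The register layout assumed here (distance low byte at 0x00, high byte at 0x01, little-endian, in centimetres) follows the Benewake TF-Luna I2C manual; verify it against your firmware version before relying on it:

```python
TFLUNA_ADDR = 0x10  # TF-Luna default I2C slave address

def decode_distance(raw):
    """Combine the two distance bytes (little-endian uint16, cm)."""
    lo, hi = raw
    return lo | (hi << 8)

def read_distance_cm(bus):
    """Read registers 0x00..0x01 from the sensor on an open SMBus."""
    return decode_distance(bus.read_i2c_block_data(TFLUNA_ADDR, 0x00, 2))

# On the Pi (requires the smbus2 package):
#   from smbus2 import SMBus
#   with SMBus(1) as bus:
#       print(read_distance_cm(bus))
```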

Figure 1: Official photo of the TF-Luna LiDAR module from Benewake manufacturer (source: en.benewake.com/TFLuna). The compact module (35mm × 21.25mm × 13.5mm) features a low-cost ToF design for stable ranging up to 8m.

  • Wiring (per Waveshare wiki):
    • Connect to Raspberry Pi GPIO:
      • Pin 1 (TF-Luna): 5V (RPi 5V).
      • Pin 2: SDA (RPi GPIO 2).
      • Pin 3: SCL (RPi GPIO 3).
      • Pin 4: GND (RPi GND).
      • Pin 5: Grounded (for I2C mode).
      • Pin 6: Optional...
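With the sensor wired as above, I2C can be enabled and verified from the shell. The commands below assume Raspberry Pi OS and I2C bus 1, the default on the Pi's GPIO header:

```shell
# Enable the I2C controller (equivalent to dtparam=i2c_arm=on in
# /boot/firmware/config.txt), then reboot.
sudo raspi-config nonint do_i2c 0

# Scan bus 1: the TF-Luna should appear at its default address 0x10.
sudo apt install -y i2c-tools
i2cdetect -y 1
```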

article-feelbeam-2025-10-23.pdf

PDF article describing the Raspberry Pi-based prototype

Adobe Portable Document Format - 414.41 kB - 10/26/2025 at 21:22


feelbeam.py

Main control script for FeelBeam prototype (Raspberry Pi 5)

x-python-script - 4.01 kB - 10/26/2025 at 20:59


  • 1 × TF-Luna LiDAR Module The TF-Luna is a compact, single-point TOF LiDAR sensor from Benewake, ideal for short-range distance measurement. It operates in I2C mode for integration with the Raspberry Pi.
  • 1 × SG90 servo motor A standard hobby servo (e.g., SG90) provides rotational haptic feedback, replacing a linear slider. The servo angle adjusts proportionally to distance, creating variable tactile pressure.
  • 1 × Raspberry Pi 5 4GB
  • 1 × LED flashlight housing, repurposed from a no-name flashlight
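The SG90 in the component list is driven by a 50 Hz PWM signal whose pulse width (nominally ~1 ms to ~2 ms; many SG90 clones accept ~0.5–2.5 ms) selects the angle. A minimal sketch of the angle-to-duty conversion, with illustrative pulse-width limits that should be calibrated against the actual servo:

```python
FREQ_HZ = 50
PERIOD_MS = 1000 / FREQ_HZ             # 20 ms PWM frame
MIN_PULSE_MS, MAX_PULSE_MS = 1.0, 2.0  # ~0 deg .. ~180 deg (calibrate!)

def angle_to_duty(angle_deg):
    """Return the PWM duty cycle (percent) for a 0-180 degree angle."""
    angle = min(max(angle_deg, 0), 180)
    pulse = MIN_PULSE_MS + (angle / 180) * (MAX_PULSE_MS - MIN_PULSE_MS)
    return pulse / PERIOD_MS * 100
```

On the Pi this duty cycle can be fed to any PWM-capable GPIO driver; higher-level libraries such as gpiozero's Servo class perform an equivalent mapping internally.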

  • Competitive Analysis – What Exists & Why FeelBeam Is Different

    Vasilii Belykh, 11/06/2025 at 20:18

    1. ALVU (MIT, 2018–2025) – Belt with TOF Array

    • Haptic: Vibrotactile (intensity/frequency)
    • Form: Wearable belt + abdominal strap
    • Similarity: Proportional distance feedback
    • Difference: No mechanical lever, vibration fatigue, not handheld
    • Source: ALVU - https://dspace.mit.edu/handle/1721.1/114285

    2. .lumen Glasses (2024–2025)

    • Haptic: 100 Hz vibration on forehead
    • Form: AR glasses with LiDAR
    • Similarity: Real-time obstacle detection
    • Difference: Vibration only, head-mounted, $2000+ price
    • Source: https://www.dotlumen.com/glasses

    3. Miniguide (Commercial, 2000s–2025)

    • Haptic: Vibration frequency
    • Form: Handheld "flashlight"
    • Similarity: Portable, distance-based feedback
    • Difference: Ultrasound (not LiDAR), vibration, no mechanical slider
    • Source: https://www.gdp-research.com.au 

    Why FeelBeam Stands Out

    | Feature     | FeelBeam                       | Others                 |
    |-------------|--------------------------------|------------------------|
    | Haptic Type | Mechanical sliding lever       | Vibration              |
    | Form Factor | Handheld flashlight            | Belt / Glasses / Cane  |
    | Target User | Newly blind (adaptation phase) | Long-term blind        |
    | Cost        | < $50                          | $500–$2000             |
    | Open Source | Yes (CC-BY-SA)                 | No                     |

    Conclusion
    No device combines LiDAR + mechanical proportional haptic + handheld form for adaptation training.  
    FeelBeam fills a gap — and we’re open to collaboration.

    → Seeking feedback: Have you seen similar mechanical haptic solutions?  
    → Next: The Portable version & Hands-on testing with newly blind users in Berlin

    Feel free to comment or DM @FeelBeamApp

  • X.COM account @FeelBeamApp

    Vasilii Belykh, 11/06/2025 at 07:11

    I've created the X.COM (Twitter) account @FeelBeamApp to publish updates!

    The project is live :)

    FeelBeam X.COM

