Abstract
This paper presents a conceptual prototype of a portable electronic assistant designed to aid visually impaired individuals in perceiving their surroundings through tactile feedback. The device utilizes a Time-of-Flight (TOF) LiDAR sensor (TF-Luna) to measure distances to nearby objects and translates these measurements into mechanical haptic feedback via a servo motor. Unlike traditional vibration-based systems, this approach employs servo positioning to provide proportional tactile cues, simulating a variable pressure or "extension" indication of proximity. Built on a Raspberry Pi 5 (4GB) platform with I2C communication, the prototype offers a low-cost, customizable solution for obstacle detection. The hardware setup, software implementation (based on manufacturer demos), and potential applications are detailed, with a focus on real-time navigation assistance. Initial user feedback has refined the project's focus toward adaptation support for newly blind individuals. Project website: www.feelbeam.app. Figures illustrate key components and prototype demonstrations.
Introduction
Visually impaired individuals face challenges in detecting environmental obstacles, often relying on white canes or auditory cues. Electronic aids incorporating LiDAR sensors can enhance spatial awareness through haptic feedback. This prototype introduces a handheld "flashlight-like" device where the TF-Luna LiDAR scans ahead, and a servo motor adjusts its angle to deliver tactile feedback—e.g., greater extension for closer objects, pressing against the user's finger for intuitive proximity sensing.
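The proximity-to-extension mapping described above can be sketched as a simple linear function. This is an illustrative sketch, not the project's actual code: the clamping limits follow the TF-Luna's 0.2m-8m range, while the 0-180 degree servo span and the specific linear mapping are assumptions for illustration.

```python
# Sketch: map a TF-Luna distance reading (cm) to a servo angle (degrees)
# so that closer objects produce greater "extension" against the finger.
# The 0-180 degree span is an assumed servo range, not the project's spec.

MIN_DIST_CM = 20    # TF-Luna minimum ranging distance (0.2 m)
MAX_DIST_CM = 800   # TF-Luna maximum ranging distance (8 m)

def distance_to_angle(dist_cm: float) -> float:
    """Closer objects -> larger angle (more extension);
    objects at or beyond maximum range -> 0 degrees (retracted)."""
    clamped = max(MIN_DIST_CM, min(MAX_DIST_CM, dist_cm))
    fraction = (MAX_DIST_CM - clamped) / (MAX_DIST_CM - MIN_DIST_CM)
    return 180.0 * fraction
```

On hardware, the resulting angle could be fed to a servo driver such as gpiozero's AngularServo (which works with the rpi-lgpio shim mentioned below), though the exact drive method used in the prototype is not specified here.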
Initial Community Feedback and Refined Focus: Preliminary feedback from a member of the visually impaired community has been instrumental in shaping the project's direction. It was noted that while experienced blind individuals often develop exceptional spatial awareness through hand exploration and other senses, the device may hold significant potential as an adaptation and training tool for those newly adjusting to vision loss. This insight has focused the mission on helping users build confidence and develop mental maps of their surroundings during the critical early stages of sight loss, when traditional mobility skills are still forming.
The system is implemented on a Raspberry Pi 5 (4GB) running Linux kernel 6.6.51 (Debian-based), with I2C enabled in /boot/firmware/config.txt via dtparam=i2c_arm=on. For GPIO compatibility on the Pi 5, python3-rpi-lgpio was installed in place of python3-rpi.gpio. The code draws from Waveshare's demo for TF-Luna, adapted for servo integration.
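The setup steps above can be summarized as the following shell fragment (a sketch, assuming Raspberry Pi OS Bookworm paths on the Pi 5; the i2c-tools verification step is an addition for convenience, not part of the described procedure):

```shell
# 1. Enable the I2C bus: ensure this line is present in /boot/firmware/config.txt
#    dtparam=i2c_arm=on
#    (then reboot for the change to take effect)

# 2. Swap the GPIO library for the Pi-5-compatible shim:
sudo apt remove -y python3-rpi.gpio
sudo apt install -y python3-rpi-lgpio

# 3. Optionally verify that the TF-Luna appears at its default address 0x10:
sudo apt install -y i2c-tools
sudo i2cdetect -y 1
```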
Hardware Description
The prototype uses affordable, off-the-shelf components for portability.
1. TOF Sensor: TF-Luna LiDAR Module
The TF-Luna is a compact, single-point TOF LiDAR sensor from Benewake, ideal for short-range distance measurement. It operates in I2C mode for integration with the Raspberry Pi.
- Specifications (from Waveshare wiki and datasheet):
- Ranging Distance: 0.2m to 8m (at 90% reflectivity indoors; reduces to 2.5m at 10% reflectivity).
- Accuracy: ±6cm (0.2m-3m), ±2% (3m-8m).
- Resolution: 1cm.
- Frame Rate: 1-250Hz (adjustable).
- Ambient Light Immunity: Up to 70Klux.
- Light Source: VCSEL at 850nm (Class 1 eye-safe).
- Field of View: 2°.
- Power Supply: 3.7V-5.2V, average current ≤70mA, peak 150mA.
- Dimensions: 35mm × 21.25mm × 13.5mm, weight <5g.
- Operating Temperature: -10°C to 60°C.
- Communication: I2C (slave mode, default address 0x10, max rate 400kbps).
Figure 1: Official photo of the TF-Luna LiDAR module from Benewake manufacturer (source: en.benewake.com/TFLuna). The compact module (35mm × 21.25mm × 13.5mm) features a low-cost ToF design for stable ranging up to 8m.
- Wiring (per Waveshare wiki):
- Connect to Raspberry Pi GPIO:
- Pin 1 (TF-Luna): 5V (RPi 5V).
- Pin 2: SDA (RPi GPIO 2).
- Pin 3: SCL (RPi GPIO 3).
- Pin 4: GND (RPi GND).
- Pin 5: Grounded (for I2C mode).
- Pin 6: Optional...
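With the sensor wired as above, a distance reading can be obtained over I2C. The following is a hedged sketch using the smbus2 library; the register layout (0x00 = distance low byte, 0x01 = distance high byte, little-endian centimeters) follows the Benewake TF-Luna datasheet but should be verified against your sensor's firmware revision.

```python
# Sketch: read a TF-Luna distance value over I2C (default address 0x10).
# Register addresses are assumptions taken from the Benewake datasheet.

TF_LUNA_ADDR = 0x10  # default I2C slave address

def parse_distance_cm(low: int, high: int) -> int:
    """Combine the two distance register bytes (little-endian) into cm."""
    return low | (high << 8)

def read_distance_cm(bus) -> int:
    """Read the distance registers from the sensor.

    `bus` is expected to expose read_i2c_block_data(addr, register, length),
    as smbus2.SMBus does.
    """
    data = bus.read_i2c_block_data(TF_LUNA_ADDR, 0x00, 2)
    return parse_distance_cm(data[0], data[1])
```

On the Pi itself this would be driven with something like `with smbus2.SMBus(1) as bus: print(read_distance_cm(bus))`, where bus 1 is the default I2C bus exposed on GPIO 2/3.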