Project “AWAAZ” essentially uses a set of momentary tactile switches, placed strategically on a wearable system based on a cryptographic frequency analysis of words, that send characters or phrases to an Arduino board, which interprets the corresponding hand movements as letters and words. The Arduino board is programmed to emulate an HID keyboard on the device it is attached to. The letters are then sent to my app, which converts the text to speech using the standard Android text-to-speech engine, so no internet connection is required; a third-party word-prediction layer reduces typing time by 32%. The system draws power directly from the phone’s battery, so it needs no external power source, has a 100% accuracy rate, and has no connectivity limitations. It also allows easy hand mobility with low finger stress, uses easily replaceable parts, and was developed within a budget of 450 Rupees.
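For illustration, a minimal sketch of the switch-to-HID-keyboard path might look like the following, assuming an ATmega32u4-based board (e.g., an Arduino Leonardo or Micro, which can natively emulate a USB keyboard); the pin numbers and the pin-to-letter mapping here are hypothetical placeholders, not the project’s actual layout:

```cpp
#include <Keyboard.h>

// Minimal sketch, assuming an ATmega32u4 board that can act as a USB HID
// keyboard and switches wired between the pins and ground. The pin-to-letter
// mapping is hypothetical; the real device derives its layout from a
// frequency analysis of words.
const int NUM_SWITCHES = 4;
const int SWITCH_PINS[NUM_SWITCHES]  = {2, 3, 4, 5};
const char SWITCH_CHARS[NUM_SWITCHES] = {'e', 't', 'a', 'o'};
bool wasPressed[NUM_SWITCHES] = {false};

void setup() {
  for (int i = 0; i < NUM_SWITCHES; i++) {
    pinMode(SWITCH_PINS[i], INPUT_PULLUP);  // pressed switch reads LOW
  }
  Keyboard.begin();  // start USB HID keyboard emulation
}

void loop() {
  for (int i = 0; i < NUM_SWITCHES; i++) {
    bool pressed = (digitalRead(SWITCH_PINS[i]) == LOW);
    if (pressed && !wasPressed[i]) {
      Keyboard.write(SWITCH_CHARS[i]);  // one character per press transition
    }
    wasPressed[i] = pressed;
  }
  delay(10);  // crude debounce interval
}
```

Firing only on the press transition keeps one keystroke per physical press; the actual device would layer its word-prediction step on top of this on the phone side.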
Project BATEYE fundamentally uses an ultrasonic sensor mounted on a wearable pair of glasses that measures the distance to the nearest object and relays it to an Arduino board. The Arduino board processes the measurement and plays a tone (150–15,000 Hz) corresponding to the distance (2 cm to 4 m) until the next ultrasonic reading arrives, and the process repeats; this cycle runs roughly every 5 milliseconds. The wearer hears a sound that changes with the distance to the nearest object. The head provides a 195-degree swivel angle, and the ultrasonic sensor detects anything within a 15-degree cone. Drawing on a systematic, cognitive, and computational approach to neuroscience, with the hypothesis that the occipital lobe of blind people is repurposed to process other sensory feedback, the machine treats the brain as a computational unit: the brain maps each tone produced every 14 ms to its corresponding distance, builds a soundscape from those tones, and the body navigates using it. During experimentation, the test subject could detect obstacles as far away as 2–3 m, and with horizontal or vertical head movements the blindfolded test subject could make out the basic shape of objects without touching them, as well as the basic nature of the obstacles.
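A minimal sketch of the distance-to-tone loop is given below, assuming an HC-SR04-style sensor and a piezo buzzer; the pin assignments, the linear frequency mapping, and the closer-object-means-higher-pitch direction are all assumptions for illustration, not necessarily BATEYE’s actual tuning:

```cpp
// Minimal sketch, assuming an HC-SR04-style ultrasonic sensor (TRIG/ECHO)
// and a piezo buzzer. Maps 2 cm–4 m onto a 150–15,000 Hz tone.
const int TRIG_PIN   = 9;
const int ECHO_PIN   = 10;
const int BUZZER_PIN = 8;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // Fire a 10 us trigger pulse, then time the returning echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  long durationUs = pulseIn(ECHO_PIN, HIGH, 25000);  // timeout ~4 m round trip
  if (durationUs > 0) {
    long distanceCm = durationUs / 58;     // round-trip microseconds to cm
    distanceCm = constrain(distanceCm, 2, 400);
    // Closer object -> higher frequency (assumed mapping direction).
    int freq = map(distanceCm, 2, 400, 15000, 150);
    tone(BUZZER_PIN, freq);                // tone plays until updated
  } else {
    noTone(BUZZER_PIN);                    // nothing within range
  }
  delay(5);  // brief pause before the next ping
}
```

Letting the tone play continuously until the next reading arrives matches the description above: the pitch holds between pulses, so approaching obstacles produce a steadily rising sound the brain can track.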