
Origin of Idea

A project log for SonoSight: The AR glass for hearing impaired

SonoSight uses the perceptual phenomenon of Synesthesia to map 3D sound as visual elements through AR.

Manoj Kumar 09/30/2017 at 15:20

I came across this idea after seeing the impressive assistive technologies being developed for visual impairment. I was also inspired by the work Dr. David Eagleman did at Stanford on synesthesia.


Synesthesia is a rare condition in which stimulation of one sensory pathway triggers perception in another, tying the two together (e.g., hearing colors, or feeling touch when seeing something). While I found the whole concept very interesting and appealing, I also saw potential in the way we could rethink assistive devices. People with a sensory disability in sight, hearing, or smell often find things difficult at first, but many of them come to rely on their other senses to guide them through daily life. People with visual impairment frequently rely on sound to cross a road, or on smell to identify people they regularly meet. A recent study using magnetic resonance imaging found that other sensory processing regions in people with blindness were more active and showed evidence of remapping (link). I personally found the concept beautiful, a testament to the resilience of the brain.

When I discussed it with my friends Vignesh and Bala, we eventually struck upon the idea of representing external sensory input to sensory-impaired people in a way that provides artificial synesthesia. The concept, while appealing from a neuroscience perspective, also had wide implications for people with such impairments. We decided to develop an artificial synesthesia system for the hearing impaired by mapping external sounds to visual stimuli. Existing efforts had taken a similar idea in a different direction: David Eagleman's group designed a vest that combines external audio signals with mechanical actuators to give tactile feedback.



(Figure from Scott Novich and David Eagleman)

While we loved the concept and the work behind it, we wanted to take a new approach that combines an Augmented Reality overlay with external sensors.
The benefits of such an approach are:

1) The ability to provide a quasi-real-time voice-to-text service for the hearing-impaired user (a rough sketch of this piece follows the list)
2) Lower cost, thanks to the advent of low-cost DLP projectors
3) Additional services provided by the AR glass
4) An information overlay that is intuitive and requires limited training
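To make the first point concrete, here is a minimal sketch of what a quasi-real-time voice-to-text loop could look like. It assumes the third-party Python SpeechRecognition package and a cloud recognizer purely for illustration; it is not the actual SonoSight pipeline, where the recognized text would be rendered on the AR overlay instead of printed.

```python
# Minimal sketch: continuously capture short phrases from a microphone and
# transcribe them. Assumes the "SpeechRecognition" package (pip install SpeechRecognition).
import speech_recognition as sr

def transcribe_forever():
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # calibrate for background noise
        while True:
            # Capture up to ~5 seconds of speech at a time to keep latency low
            audio = recognizer.listen(source, phrase_time_limit=5)
            try:
                text = recognizer.recognize_google(audio)  # cloud recognizer; an offline engine could be swapped in
                print(text)  # in the real device, this text would be drawn on the AR overlay
            except sr.UnknownValueError:
                pass  # nothing intelligible was heard; keep listening

if __name__ == "__main__":
    transcribe_forever()
```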

There are already concerns about the ethics of cochlear implants within the deaf community, so we want to make it clear that we see our device as an addition to the amazing deaf community and its intricate sign language. Assistive technologies often walk a tightrope so as not to take away from the culture these communities have built, and we would welcome feedback on how our device can adapt to work alongside it.
