This project started back in 2018 as my BA thesis, which included a more theoretical, scientific research paper about sensory plasticity (you can find both under "links", but in German only, sorry). After finishing that first step, I kept developing the device, and since the project was published open source, I decided to document my progress here. Feedback is very welcome...
A 3D image captured by a depth camera is haptically projected onto the back of the hand using vibration patterns. The location of a vibration depicts an object’s relative position in space; the strength of the vibration represents its distance.
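The mapping described above can be sketched in a few lines of code. Note that everything here is an illustrative assumption, not taken from the project: the function name, the 3×3 motor grid, and the 3 m maximum range are all hypothetical, and the real device may use a different layout and intensity curve.

```python
def depth_to_vibration(depth, grid=(3, 3), max_range=3.0):
    """Collapse a depth image (2D list of distances in metres) into a
    small grid of vibration intensities in [0, 1], one per motor.
    The grid cell mirrors the object's position in the camera frame;
    closer objects produce stronger vibration.
    (Hypothetical sketch: grid size and range are assumptions.)"""
    h, w = len(depth), len(depth[0])
    gh, gw = grid
    out = [[0.0] * gw for _ in range(gh)]
    for r in range(gh):
        for c in range(gw):
            # find the nearest valid reading inside this tile of the image
            nearest = max_range
            for y in range(r * h // gh, (r + 1) * h // gh):
                for x in range(c * w // gw, (c + 1) * w // gw):
                    d = depth[y][x]
                    if 0 < d < nearest:
                        nearest = d
            # linear mapping: 0 at max range, 1 at zero distance
            out[r][c] = 1.0 - nearest / max_range
    return out
```

A close object in the upper-left of the camera frame would then drive the upper-left motor strongly while the remaining motors stay weak, which is the position-plus-distance encoding the paragraph describes.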
The theoretical background of this project is called Sensory Substitution: a phenomenon by which the function of one missing or faulty sensory modality is replaced (substituted) by stimulating another one – in this case the tactile modality. At the beginning of the substitution process, this new stimulation has to be actively interpreted by analysing the tactile stimulus. But after some training, the new visual-like input becomes implicit and is processed subconsciously. Users begin to see the space in front of them.
These scientific efforts started more than 50 years ago, yet even today almost no blind people use substitution processes to compensate for the absence of their visual system. All attempts to develop a device for the broad market have failed.
Many projects failed at the practical implementation stage. An analysis revealed that despite the elaborate technology they used, design and usability issues were often not taken into account. I therefore followed an open, iterative rapid-prototyping approach to quickly work out strengths and potentials and to identify limitations of the hardware and algorithms. Even though my theoretical work on sensory plasticity had already predicted many aspects, it was this prototyping process that led me to a functioning device so quickly.