
Unfolding Space

Spatio-Visual to Haptic Sensory Substitution Device for Blind People

The Unfolding Space Glove is an open-source wearable that allows blind or visually impaired users to haptically sense the depth of their surrounding space and thus (hopefully) navigate through it better.
----------------------------

This project is also hosted on GitHub and has its own website (see links on the left). To keep redundancy low, you will only find an overview of the content here.

For more details, please also visit the GitHub repo.

Theoretical Background

The device employs the concept of Sensory Substitution, which in simple terms states that if one sensory modality is missing, the brain is able to receive and process the missing information by means of another modality. There has been a great deal of research on this topic over the last 50 years, but as yet there is no widely used device on the market that implements these ideas.

Previous Work

The project was initiated in 2018 as an interaction design project within my undergraduate thesis. Over the years I developed several prototypes, aiming to learn from the mistakes of other projects and to use design methods to develop a more user-friendly device.

Study in 2021

The latest prototype presented here was built in 2021 and tested in a study with 14 blind and sighted subjects (publication expected in February 2022). Also see the section "Abstract of Paper" below.

Demo Video

This is a demo from a user's perspective – recorded on the obstacle course of the 2021 study:

Video Documentation of Previous Prototype (2019)

Originally created for submission to the 2019 Hackaday Prize.

Abstract of Paper (Pending, Expected for Feb 2022)

The publication of a scientific paper about the project and the study in MDPI – Sensors is expected in February 2022. Here is the preprint abstract:

This paper documents the design, implementation and evaluation of the Unfolding Space Glove: an open source sensory substitution device that allows blind users to haptically sense the depth of their surrounding space. The prototype requires no external hardware, is highly portable, operates in all lighting conditions, and provides continuous and immediate feedback – all while being visually unobtrusive. Both blind (n = 8) and sighted but blindfolded subjects (n = 6) completed structured training and obstacle courses with the prototype and the white long cane to allow performance comparisons to be drawn between them. Although the subjects quickly learned how to use the glove and successfully completed all of the trials, they could not outperform their results with the white cane within the duration of the study. Nevertheless, the results indicate general processability of spatial information through sensory substitution by means of haptic, vibrotactile interfaces. Moreover, qualitative interviews revealed high levels of usability and user experience with the glove. Further research is necessary to investigate whether performance could be improved through further training, and how a fully functional navigational aid could be derived from this prototype.

Please_Find_All_Files_On_GitHub.txt

All files are hosted on GitHub to minimise redundancy. Please find the link on the left.

  • 1 × Custom PCB 66 € | See GitHub Hardware Repo → PCB
  • 1 × The Glove 47 € | See GitHub Hardware Repo → Glove
  • 1 × Raspberry Pi Compute Module 4 40 € | CM4101016
  • 1 × Cooling Fan for the CM4 9 € | Waveshare – cm4-fan-3007
  • 1 × ToF 3D Camera 300 € | pmdtechnologies – Pico Flexx

View all 7 components

  • Recording of my Talk at 37c3

    Jakob Kilian, 12/30/2023 at 16:17 (0 comments)

    I am very happy that I got the opportunity to speak at the 37c3, one of the biggest hacking and tech congresses in Europe, held by the German "Chaos Computer Club" in Hamburg just a few days ago. The talk was recorded and you can rewatch it in English or German under the following link. What a bummer that the tech demo didn't work out. It is always a risk; I actually have no idea what made this room or configuration so special that it failed...

    Watch the talk at media.ccc.de

  • Paper finally got published!

    Jakob Kilian, 02/28/2022 at 18:35 (4 comments)

    In this world-political gloom, I do have a bit of good news to announce:
    Our paper on the glove has finally been published after months of writing, correcting and adapting. Like the whole project, the paper is freely available. 

    Click here to go to the MDPI – Sensors journal and download the paper and supplementary material. Or go to ResearchGate to download, comment or connect... Happy to hear your thoughts!

    cheers,
    Jakob

  • Updated Hackaday Repo & Paper Publication Pending

    Jakob Kilian, 01/31/2022 at 16:29 (0 comments)

    Hey! 

    Alongside my Master's degree, I have been able to complete a lot in the last few weeks:

    • Uploaded the software & hardware repos to GitHub with the latest code, PCB files, instructions and a lot of documentation on the project
    • Uploaded some exemplary videos from the aforementioned study that I conducted last summer
    • Wrote and submitted a 30-page scientific paper to the MDPI – Sensors journal, which is currently under review. Most probably it will be published in February!
    • Updated this Hackaday site with new pictures, instructions and info. Still, most files are now hosted on GitHub.

    Very happy to reach this point after four years on this project. I will keep you posted when the paper is out!

  • Update: Study completed, Master's thesis in progress

    Jakob Kilian, 10/14/2021 at 14:58 (1 comment)

    Sorry for the big gaps in the project log. There is just too little time to document everything while working.

    So since the last post, there has been a second revision of my PCB. I changed some small things and added LEDs and a MEMS IMU (gyroscope, accelerometer and magnetometer – to be able to turn off all motors when the glove is not in its regular position). It all worked out quite well, and just in time for the trial I had all 4 prototypes (2 sizes, one spare glove each) ready. You can have a look at the photos for details for now. Of course, I will upload material, blueprints and code as soon as I have had time to put everything in order, probably in February when the thesis is finished.
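
    Side note for the tinkerers: the gating logic itself can be very simple. Here is a minimal sketch of the idea (hypothetical names and thresholds, not the actual firmware), assuming the accelerometer delivers a gravity vector in g:

        #include <cmath>

        struct Accel { float x, y, z; };   // accelerometer reading in g

        // True if the back of the hand points roughly upwards, i.e. gravity
        // lies within maxTiltDeg of the glove's z axis (threshold made up).
        bool gloveInRegularPosition(const Accel& a, float maxTiltDeg = 60.0f) {
            float norm = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
            if (norm < 0.5f) return false;   // implausible reading: play it safe
            float tiltDeg = std::acos(a.z / norm) * 180.0f / 3.14159265f;
            return tiltDeg < maxTiltDeg;
        }

        // In the main loop, the motor output could then be gated like this:
        //   if (!gloveInRegularPosition(readAccel())) muteAllMotors();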

    After finishing the prototypes, I was able to successfully conduct the study in August and September without any technical problems or other issues. I have also attached some pictures from the study – thanks to Kjell Wistoff for the photos.

    Right now I am transcribing and analysing the gathered data. In the next few months I will write my master's thesis and submit it at the end of January. The thesis will be open access and will therefore be uploaded as well. A paper on the study is also planned. More on this as soon as there is solid news.

    That's it for now...

    Total view of the system
    Battery worn on the upper arm

  • New Module with Raspi CM4 ready and working

    Jakob Kilian, 05/25/2021 at 13:09 (0 comments)

    Hey folks,

    A short post, because things are quite busy around here...

    I finished the first revision of my new module, which now hosts:

    • the ToF camera (pico flexx)
    • a Raspberry Compute Module 4
    • my new motor-driver and CM-carrier board. 

    Now you only need to plug in a battery pack via USB-C and off you go!

    I will now test the unit to identify possible errors and pitfalls. Subsequently, I will carry out a pre-test with subjects, in which I want to gather first insights into the expected parameters of the obstacle course and firm up the preparation of the study...

    See ya!

  • Research Project Approved – Yay!

    Jakob Kilian, 03/19/2021 at 10:32 (2 comments)

    Hey folks! 

    Sorry for letting so much time pass since the last update. But now there is even more exciting news to report:
    I am very happy that, after some failed attempts at other institutes, I am now able to write my master's thesis at the ZEISS Vision Science Lab (Ophthalmic Research Institute at the University Hospital Tübingen, Germany), which also provides me with a budget to conduct a small empirical study with 10-20 participants. Furthermore, as coincidence would have it, I was able to successfully apply for a grant to build the prototype tested in this study: the Kickstart @ TH Köln project (financed by federal funds) provides me with a budget to realise a revised version of the device. This way I can finally start to work through the list of shortcomings and get professional help with the trickier problems.
    And no worries: the project stays open!

    During the last few months, in which I did not post any updates, I spent little time on the device itself and a lot on defining the study design (and learning the necessary statistical basics for it), writing proposals, and soliciting formal bids for procurements. This was exhausting at times, which makes it all the more exciting that everything is now in place and I can begin the actual work.

    That's it for today. I will keep you posted when there is more to report!

  • August 2020 - Just a Short Update

    Jakob Kilian, 08/17/2020 at 14:39 (1 comment)

    Hi everybody!

    I just had the feeling it is time for a new update on my project, so that there is at least a post every few months. Because: the project is still running and I spend 1-2 days a week pushing it further. Unfortunately there is not much to show these days, as I am eagerly writing code at a complexity level way above my knowledge and skill set...

    The main task over the last weeks was to make the code more efficient (to save processing cycles and ultimately heat and battery usage...) and to let the calculations run concurrently. I therefore had to rebuild and rewrite wide parts of my code, which sometimes ended in quite a mess. However, I am nearly done now, and I promise to push commits more frequently once the structure has an acceptable form...
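
    To illustrate the pattern (a simplified sketch, not the project's actual code): the camera callback only stores the newest frame, while a worker thread processes it and drives the motors, dropping stale frames instead of queueing them.

        #include <condition_variable>
        #include <mutex>
        #include <optional>
        #include <vector>

        std::mutex mtx;
        std::condition_variable cv;
        std::optional<std::vector<float>> latestFrame;   // newest depth frame only
        bool running = true;

        // Called by the camera driver for every new depth frame (~25-45 fps).
        void onNewDepthFrame(std::vector<float> frame) {
            { std::lock_guard<std::mutex> lk(mtx); latestFrame = std::move(frame); }
            cv.notify_one();
        }

        // Worker thread (started once, e.g. via std::thread(workerLoop)):
        // process the latest frame, then write the motor values.
        void workerLoop() {
            while (true) {
                std::vector<float> frame;
                {
                    std::unique_lock<std::mutex> lk(mtx);
                    cv.wait(lk, [] { return latestFrame.has_value() || !running; });
                    if (!running) return;
                    frame = std::move(*latestFrame);
                    latestFrame.reset();                 // stale frames get dropped
                }
                // processDepthFrame(frame);   ~3 ms
                // writeMotorValues(...);      ~5 ms
            }
        }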

    Just some benchmark data:
    Processing a depth frame now takes only about 3 ms.
    Writing the values to the motor drivers takes about 5 ms.
    I can now easily reach the full 45 fps on a Raspberry Pi 4 – or let the CPU sleep for about 80% of the time if I run at 25 fps...
    The total latency is now at ~30 ms – perfect for multisensory integration.
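
    (Checking the numbers: ~3 ms of processing plus ~5 ms of motor writing makes about 8 ms of work per frame. At 25 fps one frame period is 40 ms, so roughly 32 ms – about 80% – remain for sleeping; at 45 fps the ~22 ms period still comfortably fits the 8 ms of work.)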

    I will keep you updated!

    Stay safe and healthy...

  • Working on the Monitoring App and MA Thesis

    Jakob Kilian, 04/27/2020 at 21:43 (1 comment)

    Hey folks. This is just a very short update to keep you in the loop:

    I am still working on this project – due to COVID I actually have even more time to put into it. But at the same time I need to get the scientific part of my MA thesis done. Therefore I am eagerly reading literature and papers around the topic and preparing a little study with the device that will hopefully take place at the end of the year or so... No one knows how long this virus will last.

    For this I am preparing an app that is remotely connected to the Raspi via UDP over Wi-Fi and allows me to see all the data, settings, bugs etc. This was an important step for me to speed up the debugging process and to be able to support people during their first contact with the device (as I can see what they "see" or feel...). So here is a small screenshot to show you my work in progress... I am already pretty happy with this, as the connection is stable at 25 fps :-)
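
    The transport side of this is pretty minimal. Here is a stripped-down sketch of the sending end on the Raspi (illustrative only – the port, address and payload here are made up, not the real app's protocol):

        #include <arpa/inet.h>
        #include <sys/socket.h>
        #include <unistd.h>
        #include <string>

        int main() {
            int sock = socket(AF_INET, SOCK_DGRAM, 0);   // connectionless UDP socket

            sockaddr_in app{};                           // address of the monitoring app
            app.sin_family = AF_INET;
            app.sin_port = htons(9000);                  // hypothetical port
            inet_pton(AF_INET, "192.168.0.42", &app.sin_addr);   // example address

            std::string status = "fps=25 latency_ms=30"; // example status payload
            sendto(sock, status.data(), status.size(), 0,
                   reinterpret_cast<const sockaddr*>(&app), sizeof(app));
            close(sock);
            return 0;
        }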

  • Research Fund declined

    Jakob Kilian, 01/27/2020 at 18:17 (2 comments)

    Hey folks, 

    I am writing a short update on the project, as there was movement not on the content side but on the administrative one: I applied for a research fund (worth €10k) and made it to the final round. Unfortunately I was declined, so I again have to look for more money to realise an empirical study of the device. Hmm, could crowdfunding work?

    The good thing is: a lot of the theoretical work is already done thanks to the application, and I can still use it for my MA thesis...

    I will keep you updated!

  • My project won a Hackaday Prize Honorable Mention!

    Jakob Kilian, 11/17/2019 at 11:50 (2 comments)

    The Hackaday Prize winners are out! And Unfolding Space won an honorable mention for "Best Concept".

    Once more I can express my admiration for the high level of quality and elaboration of the finalists' projects. I am super happy and I really feel "honored" (haha) to be among this year's winners.

    I definitely know what to spend the money on, and I am very happy to have the opportunity and privilege to continue working on something that is exciting and demanding at the same time.

    Thanks for your support, and thanks also to Supplyframe and the other sponsors for making this happen.

    Cheers!

View all 23 project logs

  • 1
    Overview

    The glove has two layers of fabric with the vibration motors and cables in between. Velcro attaches the Unfolding Space Carrier Board (see the pcb directory of the repo) to it, and a ribbon cable exits the back of the glove, terminating in an IDC connector that links glove and board. The end result looks like this (two sizes: red = big and purple = small):

    The glove in its finished state.
  • 2
    Textile Work

    First of all, we need the inner and outer layers of the glove. You could sew these yourself, but that is very time-consuming. I therefore looked for suitable ready-made models to work with. It quickly became clear that both gloves had to be very thin, because otherwise the construction would be too thick and uncomfortable. To achieve a high degree of stretch and durability at the same time, I looked primarily for polyamide textiles. Most of these are coated work gloves, which made the search more difficult.

    For the inner layer, I found what I was looking for in a very thin polyamide clean-room glove (article number 12025) from Munitec (they were kind enough to provide me with samples of their collection). These gloves can also be cut cleanly without fraying.

    The outer layer should meet basic aesthetic requirements, look neutral and also be stretchable and resistant. Here I went with a no-name product (Worksafe L71-720) from Amazon (though I hate buying there).

  • 3
    Textile Work: Outer Layer

    I cut the fingers of the gloves halfway so that the two sizes would fit as well as possible on different hands.

    Cut the Fingers Halfway

    A kitchen knife next to the cut-in-half finger of the black glove on a wooden surface.

    To avoid fraying in the first place, it proved most effective to use a sharp kitchen knife.

    Since I don't have a tool to hem the cut edges of the fingers (no idea how that would work), I experimented a bit with anti-fraying glue and the like, and finally settled on liquid latex, typically used to make anti-slip socks. With some wooden stuffing in the fingers I was able to dip them into the latex and cut a clean edge after drying:

    Stuffing Material

    Short pieces of a wooden stick used as stuffing material.

    Just Before Dipping

    The fingers of the glove stuffed with the pieces of wooden stick just before dipping.

    After Dipping

    Fresh, grey liquid latex at the fingertips of the glove after dipping it.

    Drying ...

    Suspended glove with drying latex on the fingertips.

    Getting Rid of Overhang

    A cutter cuts away the overhanging latex lumps.

    Finish Edges

    A small pair of scissors lies on the table next to the glove. Fine cuttings lie next to them.

    Resulting in these edges that do their job of not fraying and not looking too bad:

    Result

    Macro shot of several gloves with focus on the finished, hemmed fingertips

View all 9 instructions

Discussions

sanskar329 wrote 02/05/2020 at 12:55

So does new code have to be developed, or can modifications be made to your code?

Please help

Jakob Kilian wrote 03/30/2020 at 13:05

Hey Sanskar. Basically the code runs quite OK. Maybe the performance could be enhanced, and maybe some work could be invested in building a smartphone app to "check" on the status of the device, edit the settings and see the actual camera data...

It is kind of difficult to arrange teamwork here, as the physical device is needed to do some tests, right...?

best, 

Jakob

sanskar329 wrote 02/05/2020 at 12:21

Can we use a 2D camera and a distance sensor for this project?

Jakob Kilian wrote 02/05/2020 at 12:34

As I described in my previous answer, I wouldn't go for a 2D camera, as too much cognitive effort is needed to localise objects in a flat image at such a small resolution... But what could work as a basic setup is to use 9 single-pixel distance sensors (like the GY-53L1 – one for each motor) and position them on the glove...
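
For illustration, a rough sketch of that idea (hypothetical helper functions, not tested code – the real sensor read and motor write calls depend on your wiring):

    #include <algorithm>
    #include <cstdint>

    uint16_t readDistanceMm(int /*sensor*/) { return 1500; } // stub: read one GY-53L1
    void setMotorPwm(int /*motor*/, uint8_t /*duty*/) {}     // stub: write one motor driver

    // Closer obstacle -> stronger vibration, clamped to a sensible range.
    uint8_t distanceToDuty(uint16_t mm, uint16_t nearMm = 300, uint16_t farMm = 4000) {
        uint16_t d = std::clamp(mm, nearMm, farMm);
        return static_cast<uint8_t>(255u - (d - nearMm) * 255u / (farMm - nearMm));
    }

    void updateAllMotors() {
        for (int i = 0; i < 9; ++i)                          // one sensor per motor
            setMotorPwm(i, distanceToDuty(readDistanceMm(i)));
    }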

sanskar329 wrote 01/27/2020 at 17:59

Is it possible to do this project using a basic camera? Or at least a cheaper camera, which could make the project more cost-effective? What problems would it have, though?

Jakob Kilian wrote 01/27/2020 at 18:12

Hey sanskar,

Basically you could! There are also some documented experiments with wearable low-cost camera devices on the internet. But I specifically chose the 3D camera, as it is way easier to "find" an obstacle in this data than in a b/w video stream, where you have to interpret perspective, motion parallax, occlusion, shadows, relative size etc. So learning to substitute vision by interpreting classical visual data as tactile input might take a lot longer and might not be possible with just 9 pixels like my glove has.
I reckon that the price of ToF sensors and other 3D cameras (like RealSense or Kinect) will decrease over the next few years, so that when this thing is fully developed, they will just cost a few bucks. Hopefully I am not wrong...

cheers!

Michał wrote 10/13/2019 at 11:38

Well done! Great work.

Why did you not use the opposite side of the hand? It seems more natural and more sensitive – there are more receptors there...

Go further... Scale up the effectors – put a wider matrix on clothes or shoes. That way you get much better continuous, general, directional orientation in space, regardless of moving parts of the system like gloves. Of course, vibration modules are nice but still bulky, so maybe try delicate(!) HV, electroactive polymers (shape-changing fabric) or piezo something... 😸 If this method is to be practical it should be as printable/flexible as possible, fluently integrated with clothes. A big challenge, but it is possible...

Jakob Kilian wrote 11/09/2019 at 18:45

Hey Michał, sorry for the late reply. Somehow I overlooked your post...

Actually, I chose this side because I wanted to maintain the full functionality of the hand. With the vibration motors, that couldn't be guaranteed on the palm. Furthermore, I wanted the sensation to be as easy to learn as possible, and during my trials I found out that it is easier to have the sensation on the side where the sensed object is actually located, which is the back of the hand when the hand is in a natural position.

Yes, that would be an interesting step to do research on! Still, the hand has one big advantage: the degree of movement. In cognitive science it is believed that (new) sensory input can only be learned through movement (and the related feedback) – the more movement, the faster the learning. My first experiments have somewhat confirmed this theory: you can even shake your hand in various directions and still keep the relation to the real world around you.

But what I definitely want to implement is a sensor system at ground level to identify small obstacles lying on the ground. That would be a great opportunity to test other parts of the body...

And last but not least (sorry for the long answer), I am working on electro-stimulation. This could be a small, lightweight, scalable and low-power alternative to the motors. I just wanted to start with an easier approach (motors) before I grill people's hands with high voltages (which you need for stimulation, haha).

What is HV? High Voltage? 

Thanks for the tips, I'll keep them in mind!

Brandon wrote 09/17/2019 at 20:06

Hey, just found this looking through the Hackaday Prize entries! LOVE this idea.

So we're working on a depth-sensing + AI platform (aptly named DepthAI) that allows object localization in real time.

Would be more than happy to send you one if you think it would be useful for this. My first thought is that it would be... as it gives information on what objects are, and where they are in 3D space, in real time.

So then this can be used to map vibrations to the hand through your system.

We're also a hackaday project, here: https://hackaday.io/project/163679-luxonis-depthai

Thoughts?

And thanks for making such a cool system! We once did something similar for radio waves... so you could search out transmitters through vibrations on your hand. And there's the Commute Guardian bike safety system, which is why we're making DepthAI – it's our end goal, and it will use haptic feedback to the biker as well as a warning. It's about giving 'eyes in the back of your head' to bike commuters.

Cheers,
Brandon & The Luxonis Team!  

Jakob Kilian wrote 09/18/2019 at 12:23

Hi Brandon,

thanks for your nice words! 

Just checked out your project – it is amazing and super sophisticated. I always thought about adding something like this to my very basic orientation device, but I did not have enough engineering skills to do so… So you are absolutely right, this would be a great combination, and I'd be extremely happy to receive one of the shields or to work together with you!

I am super busy right now due to work, other stuff and also (luckily) the Hackaday Prize. I'll send you a PM so that we can stay in contact and have a chat in October at the latest.

Thank you in advance. And also props for your great project – we live in very interesting times.

Jakob

Brandon wrote 09/18/2019 at 19:21

Hey, thanks for all the kind words! And yes, definitely will keep in touch. Will keep you posted on hardware and firmware progress.

Thanks again,

Brandon

Jakob Kilian wrote 02/15/2019 at 12:53

Thanks guys. Just wanted to add: the basic idea behind this is kind of old – check out Paul Bach-y-Rita's amazing studies back in the 1960s (https://youtu.be/7s1VAVcM8s8?t=147) and loads of others (keyword: sensory substitution). What is new is the 3D ToF technology and cheap hardware in general. This brings everything to a new level...

Josh Cole wrote 02/15/2019 at 06:38

This is such a cool idea! I've been thinking recently about how to use sensory mechanisms beyond eyesight to convey information. This is next level though, and could be life-changing for some people. I can't wait to see/feel where it goes next!

Malek wrote 02/12/2019 at 19:24

Great work, keep going!
