
Unfolding Space

see with your hands

The “Unfolding Space” glove is a prototypical open-source navigation aid for the visually impaired. The project deals with sensory substitution, a phenomenon by which the function of one missing or dysfunctional sensory modality is replaced (substituted) by stimulating another one. The outcome is a prototype that projects a 3D image, generated by a special depth camera, onto the back of the hand as vibration patterns. Visually impaired people can thus use the tactile modality of their hand to actually see.

Thanks a lot to pmdtechnologies (pmdtec.com/picofamily) for their support in providing a CamBoard pico flexx ToF camera for free – it made things a lot easier for me...

This project started back in 2018 as my BA thesis project. I published everything open source to inspire others to work on the topic. You can find all files, instructions, code, papers and materials linked on this project page.

As I'm an interaction designer, this is mainly a design project. When it comes to software and hardware development I'm still learning, so I apologise if some things are not as well organised as in other projects. Don't hesitate to drop me a line if you have any questions or feedback!



Basic Functionality

A 3D image from a depth camera is haptically projected onto the back of the hand using vibration patterns. The location of a vibration depicts an object's relative position in space; the strength of the vibration represents its distance.
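
In code, this boils down to collapsing the depth image into one value per motor. Here is a rough sketch of such a mapping in Python/NumPy – this is not the glove's actual implementation; the 3×3 grid, the distance range (MIN_DIST/MAX_DIST) and the "nearest pixel per region" rule are assumptions to illustrate the idea:

```python
import numpy as np

# Assumed working range of the mapping in metres – tune to your own setup.
MIN_DIST = 0.3   # anything closer vibrates at full strength
MAX_DIST = 2.5   # anything farther is ignored

def depth_to_motor_levels(depth_frame, grid=(3, 3)):
    """Collapse a depth frame into one vibration level (0.0-1.0) per motor.

    depth_frame: 2D NumPy array of distances in metres, 0 = no valid measurement.
    Returns a grid of levels where 1.0 = strong vibration (close object), 0.0 = off.
    """
    h, w = depth_frame.shape
    rows, cols = grid
    levels = np.zeros(grid)
    for r in range(rows):
        for c in range(cols):
            patch = depth_frame[r * h // rows:(r + 1) * h // rows,
                                c * w // cols:(c + 1) * w // cols]
            valid = patch[patch > 0]       # ignore pixels without a valid measurement
            if valid.size == 0:
                continue                   # nothing measured -> motor stays off
            nearest = valid.min()          # react to the closest obstacle in this region
            if nearest >= MAX_DIST:
                continue
            # Linear mapping: close -> 1.0 (strong), far -> 0.0 (off)
            levels[r, c] = 1.0 - (max(nearest, MIN_DIST) - MIN_DIST) / (MAX_DIST - MIN_DIST)
    return levels
```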

Scientific Background

The theoretical background of this project is sensory substitution: a phenomenon by which the function of one missing or faulty sensory modality is replaced (substituted) by stimulating another one – in this case the tactile modality. At the beginning of the substitution process, the new stimulation has to be actively interpreted by analysing the tactile stimulus. After some training, however, the new vision-like input becomes implicit and is processed subconsciously: users begin to see the space in front of them.

Motivation

Scientific work on sensory substitution started more than 50 years ago, yet even today there are almost no blind people using substitution to compensate for the absence of their visual system – all attempts to develop a device for the broad market have failed.

Process

Many projects failed at the practical implementation. An analysis revealed that, despite the elaborate technology used, design and usability issues were often not taken into account. I therefore followed an open, iterative rapid-prototyping approach to quickly work out strengths and potentials and to identify limitations of the hardware and algorithms. Even though my theoretical work on sensory plasticity had already predicted many aspects, it was this prototyping process that led me to a functioning device so quickly.

Here is my old documentation:

General_Shematic-V02.pdf

A schematic of the basic configuration of the system. There is a separate diagram for the motor board PCB driving the vibration motors.

Adobe Portable Document Format - 2.65 MB - 11/09/2019 at 18:34


Computing_Unit_Case-V01.stl

This is the STL export of the custom-made case. Go to https://a360.co/2ojlpKu to view the Fusion file in the browser and download it in various file formats. After the case is printed, follow the instructions for assembly.

Standard Tesselated Geometry - 635.92 kB - 11/09/2019 at 16:02


Motor_Board_V02.zip

PCB design for the "Motor Board V0.2", including the Eagle file, the Gerber files and an Aisler.net project (https://aisler.net/p/DMKOGATR) ready to order incl. parts. Check the instructions for more information.

Zip Archive - 395.44 kB - 11/09/2019 at 16:00


Motor_Board_V02-Schematic.png

Schematic diagram of the Motor_Board

Portable Network Graphics (PNG) - 220.79 kB - 11/09/2019 at 16:00


Parts_List_V02.pdf

PDF Parts list with links and prices

Adobe Portable Document Format - 76.58 kB - 11/09/2019 at 16:00



  • 1 × pmdtechnologies CamBoard pico flexx (ToF depth camera) – 300 €
  • 1 × custom-made PCB – ca. 50 €
  • 9 × DRV2605L haptic motor driver – included in the custom-made PCB
  • 2 × TCA9548A I2C multiplexer – included in the custom-made PCB (see the addressing sketch below)
  • 9 × G0832012 vibration motor – 3 €

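The nine DRV2605L haptic drivers all share the same fixed I2C address (0x5A), which is why they sit behind two TCA9548A I2C multiplexers on the motor board. The sketch below shows how such a setup can be addressed from a Raspberry Pi with the smbus2 Python library. It is not the project's firmware: the multiplexer addresses (0x70/0x71), the way the nine drivers are split across the two multiplexers and the bare-bones RTP setup (real firmware also has to configure and calibrate the driver for its actuator) are assumptions for illustration.

```python
from smbus2 import SMBus

TCA9548A_ADDRS = [0x70, 0x71]   # assumed multiplexer addresses (set via their A0-A2 pins)
DRV2605L_ADDR = 0x5A            # fixed I2C address of every DRV2605L
REG_MODE, REG_RTP = 0x01, 0x02  # DRV2605L mode and real-time-playback input registers
MODE_RTP = 0x05                 # RTP mode: the vibration amplitude is written directly to REG_RTP

def select_channel(bus, mux_addr, channel):
    """Route the bus to one channel of a TCA9548A and disconnect the other multiplexer,
    so that only one of the identically addressed drivers is visible at a time."""
    for addr in TCA9548A_ADDRS:
        if addr != mux_addr:
            bus.write_byte(addr, 0x00)      # disable all channels of the other multiplexer
    bus.write_byte(mux_addr, 1 << channel)

def set_motor(bus, motor_index, level):
    """Drive one of the nine motors (index 0-8) at a level between 0.0 and 1.0."""
    mux = TCA9548A_ADDRS[motor_index // 8]  # assumed split: motors 0-7 on mux 1, motor 8 on mux 2
    select_channel(bus, mux, motor_index % 8)
    bus.write_byte_data(DRV2605L_ADDR, REG_MODE, MODE_RTP)
    bus.write_byte_data(DRV2605L_ADDR, REG_RTP, int(level * 127))  # 0-127: positive range of the signed RTP format

with SMBus(1) as bus:                       # I2C bus 1 on the Raspberry Pi
    for i in range(9):
        set_motor(bus, i, 0.5)              # all nine motors at half strength
```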

  • Update: Study completed, Master's thesis in progress

    Jakob Kilian, 4 days ago (0 comments)

    Sorry for the big gaps in the project log. There is just too little time to document everything while working.

    So since the last post there has been a second revision of my PCB. I changed some small things and added LEDs and a MEMS IMU (gyroscope, accelerometer and magnetometer) – to be able to turn off all motors when the glove is not in its regular position. It all worked out quite well, and just in time for the trial I had all 4 prototypes (2 sizes, one spare glove each) ready. You can have a look at the photos for details for now. Of course, I will upload material, blueprints and code as soon as I have had time to put everything in order, probably in February when the thesis is finished.
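
    The orientation gate itself can be quite simple: estimate pitch and roll from the accelerometer's gravity vector and mute the motors outside a tolerance window. Here is a rough sketch of the idea – the thresholds and the axis mapping depend on how the sensor is mounted and are only assumptions here, not the actual firmware:

```python
import math

# Assumed tolerance before the glove counts as "not in regular position".
PITCH_LIMIT = 60.0   # degrees
ROLL_LIMIT = 60.0    # degrees

def glove_in_regular_position(ax, ay, az):
    """Estimate pitch/roll from the accelerometer's gravity vector and check the tolerance window."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return abs(pitch) < PITCH_LIMIT and abs(roll) < ROLL_LIMIT

def gate_motor_levels(levels, ax, ay, az):
    """Pass the vibration levels through, or mute all motors if the glove is out of position."""
    if glove_in_regular_position(ax, ay, az):
        return levels
    return [0.0] * len(levels)
```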

    So after finishing the prototypes, I was able to successfully conduct the study in August and September without any technical problems or other issues. I have also attached some pictures from the study – thanks to Kjell Wistoff for the photos.

    Right now I am doing the transcription and analysis of the data gathered. In the next few months, I will write my master's thesis and submit it at the end of January. The thesis will be open access and therefore be uploaded as well. A paper on the study is also planned to be published. More on this as soon as there is solid news.

    That's all for now...

    (Photos: total view of the system; battery worn on the upper arm)

  • New Module with Raspi CM4 ready and working

    Jakob Kilian, 05/25/2021 at 13:09 (0 comments)

    Hey folks,

    A short post, because things are quite busy around here...

    I finished the first revision of my new module which now hosts:

    • the ToF camera (pico flexx)
    • a Raspberry Compute Module 4
    • my new motor-driver and CM-carrier board. 

    Now you only need to plug in a battery pack via USB-C and off you go!

    I will now test the unit to identify possible errors and pitfalls. After that, I will carry out a pre-test with subjects, in which I want to gain first insights into the expected parameters of the obstacle course and refine the preparation of the study...

    See ya!

  • Research Project Approved - Yay!

    Jakob Kilian, 03/19/2021 at 10:32 (1 comment)

    Hey folks! 

    Sorry for letting so much time pass since the last update. But now there is even more exciting news to report:
    I am very happy that, after some failed attempts at other institutes, I am now able to write my master's thesis at the ZEISS Vision Science Lab (Ophthalmic Research Institute at the University Hospital Tübingen, Germany), which also gives me a budget to conduct a small empirical study with 10–20 participants. Furthermore, as coincidence would have it, I was able to successfully apply for a grant to build the prototype tested in this study: the Kickstart @ TH Köln project (financed by federal funds) provides me with a budget to realise a revised version of the device. This way I can finally start working through the list of shortcomings and get professional help with the trickier problems.
    And no worries: the project stays open!

    Over the last few months, in which I did not post any updates, I spent little time on the device itself and a lot of time on defining the study design (and learning the necessary statistical basics for it), writing proposals and soliciting formal bids for procurement. This has been exhausting at times, which makes it all the more exciting that everything is now in place and I can begin the actual work.

    So far for today. I will keep you posted when there is more to report!

  • August 2020 - Just a Short Update

    Jakob Kilian, 08/17/2020 at 14:39 (1 comment)

    Hi everybody!

    I just had the feeling it is time for a new update, so that there is at least a post every few months. Because the project is still running, and I spend 1–2 days a week pushing it further. Unfortunately there is not much to show these days, as I am writing code at a complexity level way above my knowledge and skill set...

    The main task over the last weeks was to make the code more efficient (to save processing cycles and, ultimately, heat and battery) and to let the calculations run concurrently. I therefore had to rebuild and rewrite large parts of my code, which sometimes ended in quite a mess. However, I am nearly done now and I promise to push commits more frequently once the structure has an acceptable form...

    Just some benchmark data:
    The whole processing of a depth frame now only takes about 3 ms.
    Writing the values to the motor drivers takes about 5 ms.
    I can now easily reach the full 45 fps on a Raspberry Pi 4, OR let the CPU sleep for about 80% of the time if I use 25 fps...
    The total latency is now at ~30 ms – perfect for multisensory integration.
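
    For reference: at 25 fps one frame period is 40 ms, so roughly 8 ms of work per frame (3 ms processing + 5 ms motor output) leaves about 80% of the time idle – that is where the numbers above come from. A very rough sketch of such a fixed-rate loop; the function names are placeholders, not my actual code, which is driven by the camera's frame callback:

```python
import time

TARGET_FPS = 25
FRAME_PERIOD = 1.0 / TARGET_FPS          # 40 ms at 25 fps

def run_loop(get_depth_frame, compute_levels, write_motors):
    """Process depth frames at a fixed rate and sleep away the unused frame budget."""
    while True:
        start = time.monotonic()
        frame = get_depth_frame()        # fetch a new depth frame
        levels = compute_levels(frame)   # ~3 ms in the current build
        write_motors(levels)             # ~5 ms over I2C to the motor drivers
        busy = time.monotonic() - start
        time.sleep(max(0.0, FRAME_PERIOD - busy))   # idles ~80% of the frame at 25 fps
```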

    I will keep you updated!

    Stay safe and healthy...

  • Working on the Monitoring App and MA Thesis

    Jakob Kilian, 04/27/2020 at 21:43 (1 comment)

    Hey folks. This is just a very short update to keep you in the loop:

    I am still working on this project – due to Covid I actually have even more time to put into it. But at the same time I need to get the scientific part of my MA thesis done, so I am eagerly reading literature and papers on the topic and preparing a small study with the device that will hopefully take place at the end of the year or so... No one knows how long this virus will last.

    For this I am preparing an app that connects remotely to the Raspi via UDP over WiFi and lets me see all the data, settings, bugs etc. This was an important step to speed up debugging and to be able to support people during their first contact with the device (as I can see what they "see" or feel...). So here is a small screenshot to show you the work in progress... I am pretty happy with it already, as the connection is stable at 25 fps :-)
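
    The link itself is plain UDP. Below is a minimal sketch of such a telemetry sender on the Raspi side – the address, port and packet layout (a small JSON dict) are made up for illustration and are not the app's real protocol:

```python
import json
import socket

MONITOR_ADDR = ("192.168.0.42", 9000)    # hypothetical address/port of the machine running the app

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_status(motor_levels, fps, settings):
    """Fire-and-forget one status packet to the monitoring app."""
    packet = json.dumps({
        "motors": motor_levels,          # e.g. nine values between 0.0 and 1.0
        "fps": fps,
        "settings": settings,
    }).encode()
    sock.sendto(packet, MONITOR_ADDR)

# Example: send_status([0.0] * 9, 25, {"max_dist": 2.5})
```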

  • Research Fund declined

    Jakob Kilian, 01/27/2020 at 18:17 (2 comments)

    Hey folks, 

    I am writing a short update on the project, as there was movement not on the content side but on the administrative one: I applied for a research grant (of 10k €) and made it to the final round. Unfortunately I was declined, so I again have to look for money to realise an empirical study on the device. Hmm, could crowdfunding work?

    The good thing is: a lot of the theoretical work is already done thanks to the application, and I can still use it for my MA thesis...

    I will keep you updated!

  • My project won a Hackaday Prize Honorable Mention!

    Jakob Kilian, 11/17/2019 at 11:50 (2 comments)

    The Hackaday Prize winners are out! And Unfolding Space won an honorable mention for "Best Concept".

    Once more I can express my admiration for the high level of quality and elaboration of the finalists' projects. I am super happy and I really feel "honored" (haha) to be among this year's winners.

    I definitely know what to spend the prize money on, and I am very happy to have this opportunity and privilege to continue working on something that is exciting and demanding at the same time.

    Thanks for your support, and thanks also to Supplyframe and the other sponsors for making this happen.

    Cheers!

  • Updated Files and Schematics

    Jakob Kilian, 11/09/2019 at 18:42 (0 comments)

    I invested some time in making my files, schematics and instructions clearer. You can see the updated pictures all over the page. Comment if you spot a mistake! This is my new general schematic (would you call it that? I'm not sure it's the right term...)

  • Last but not least: My Video Submission for the Hackaday Prize

    Jakob Kilian, 10/01/2019 at 13:47 (1 comment)

    And finally, here is my submission video for the Hackaday Prize 2019.

    As always: feedback is very welcome!

  • Instructions Online Now

    Jakob Kilian, 10/01/2019 at 00:53 (0 comments)

    In another night shift I uploaded some instructions. This certainly won't be the last word – things are changing all the time and I am still figuring out the best way to build the prototype...

    But hey, now I have some fancy visualisations, so maybe you can understand a little better what I did here ;-) Have fun with it!


  • 1
    Download and View the Material Carefully

    Building and assembling the prototype entails considerable effort, but it should generally be feasible for hobbyists and makers as well as for professionals in electronics and hardware development. Many steps will no doubt differ from build to build, as the materials and tools used may differ. This guide therefore tries to give you an overview of the decisions I made and the strategy I followed when I built my unit. I am very interested in your approach and in your questions and proposals for improvement, so comment here or drop me a line when you start working on this!



    Have fun with this guide, I hope it will help!



  • 2
    COMPUTING UNIT: Find, Buy and Print Components

    Plenty of parts are needed and some may require improvisation, as not all of them can be purchased everywhere in the world.

    We start with the casing for the computing unit. For this you need:

    The Armband and the Adapter
    There are many detachable sports armbands on offer, in different variants. This one and this one should come closest to the one I used. You can also find the parts separately, like this adapter.

    TNTOR Power Bank
    Might be difficult to find outside of Europe – at least I couldn't find it elsewhere. It is a very thin (8.8 mm) power bank which generally keeps its promises.

    Raspberry Pi 3B+
    You will know best where to purchase one. I didn't test many other versions, but performance should be best on the newest one... The Raspberry Pi ZERO DOES NOT WORK, as its CPU lacks the NEON instructions used by the camera's library (there is a quick check for this at the end of this step).

    The 3D printed Case itself
    You can find the 3D files in the files section. Hopefully you have access to a printer. Double-check the measurements to make sure your components fit!
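
    If you are not sure whether a given board's CPU supports NEON, a quick (unofficial) check is to look at the CPU feature flags: on 32-bit ARMv7 systems the flag is called "neon", on 64-bit ARMv8 the equivalent is "asimd". For example:

```python
def has_neon():
    """Rough check for NEON/ASIMD support by reading the CPU feature flags."""
    with open("/proc/cpuinfo") as f:
        feature_lines = [line for line in f if line.startswith(("Features", "flags"))]
    return any("neon" in line or "asimd" in line for line in feature_lines)

print("NEON available:", has_neon())
```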





  • 3
    COMPUTING UNIT: Slim Down the Raspberry

    To slim the computing unit down, I removed the USB ports, the Ethernet port and the "Display" socket and cut off the end of the Raspberry Pi 3B+. Have a look at this and this tutorial!
    First the tricky part: desolder the parts and clean up the solder joints. Once the parts are removed, cut the Raspberry Pi along the green line:


    Thanks to Killar on Sudomod.com

    After the diet mine looked like this:



Discussions

sanskar329 wrote 02/05/2020 at 12:55

So does new code have to be developed, or can modifications be made to your existing code?

Please help.

Jakob Kilian wrote 03/30/2020 at 13:05

Hey Sanskar. Basically the code is running quite OK. Performance could be enhanced, and some work could be invested in building a smartphone app to "check" on the status of the device, edit the settings and see the actual camera data...

It is kind of difficult to arrange teamwork here, as the physical device is needed to do some tests, right...?

Best,

Jakob

sanskar329 wrote 02/05/2020 at 12:21

Can we use a 2D camera and a distance sensor for this project?


Jakob Kilian wrote 02/05/2020 at 12:34

As I described in my previous answer, I wouldn't go for a 2D camera, as too much cognitive effort is needed to localise objects in a plain image at such a small resolution... But what could work as a basic setup is to use 9 single-pixel distance sensors (like the GY-53L1 – one for each motor) and position them on the glove...

sanskar329 wrote 01/27/2020 at 17:59

Is it possible to do this project with a basic camera? Or at least a cheaper camera, which would make the project more cost-effective? What problems would it have, though?

Jakob Kilian wrote 01/27/2020 at 18:12

Hey Sanskar,

basically you could! There are some documented experiments with wearable low-cost camera devices on the internet. But I specifically chose the 3D camera because it is much easier to "find" an obstacle in that data than in a b/w video stream, where you have to interpret perspective, motion parallax, occlusion, shadows, relative size etc. So learning to substitute vision by interpreting classical visual data as tactile input might take a lot longer and might not be possible with just the 9 pixels my glove has.
I reckon that the price of ToF sensors and other 3D cameras (like RealSense or Kinect) will decrease over the next few years, so that by the time this thing is fully developed they will cost just a few bucks. Hopefully I am not wrong...

Cheers!

Michał wrote 10/13/2019 at 11:38

Well done! Great work.

Why did you not use the opposite side of the hand? It seems more natural and more sensitive – there are more receptors there...

Go further... Scale up the effectors – put a wider matrix on clothes and shoes. That way you get much better, continuous, general, directional orientation in space, regardless of moving parts of the system like gloves. Of course, vibration modules are nice but still bulky, so maybe try delicate(!) HV electroactive polymers (shape-changing fabric) or something piezo... 😸 If this method is to be practical, it should be as printable/flexible as possible and fluently integrated with clothes. A big challenge, but it is possible...

Jakob Kilian wrote 11/09/2019 at 18:45

Hey Michał, sorry for the late reply. Somehow I overlooked your post...

Actually I chose this side because I wanted to maintain the full functionality of the hand, which couldn't be guaranteed with the vibration motors on the palm. Furthermore, I wanted the sensation to be as easy to learn as possible, and during my trials I found it easier to have the sensation on the side where the sensed object is actually located – which is the back of the hand when the hand is in a natural position.

Yes, that would be an interesting step to do research on! Still, the hand has one big advantage: the degree of movement. In cognitive science it is believed that (new) sensory input can only be learned through movement (and the related feedback) – the more movement, the faster the learning. My first experiments somewhat confirmed this theory: you can even shake your hand in various directions and still keep the relation to the real world around you.

But what I definitely want to implement is a sensor system at ground level to identify small obstacles lying on the ground. That would be a great opportunity to test other parts of the body...

And last but not least (sorry for the long answer), I am working on electro-stimulation. This could be a small, lightweight, scalable and low-power alternative to the motors. I just wanted to start with the easier approach (motors) before I grill people's hands with high voltages (which you need for stimulation, haha).

What is HV? High voltage?

Thanks for the tips, I'll keep them in mind!

Brandon wrote 09/17/2019 at 20:06

Hey just found this looking through the Hackaday Prize entries!  LOVE this idea.  

So we're working on a Depth sensing + AI platform (aptly named DepthAI) that allows object localization in real-time.  

Would be more than happy to send you one if you think it would be useful for this.  My first thought is that it would be... as it gives information on what objects are, and where they are in 3D space, real-time.

So then this can be used to map vibrations to the hand through your system.

We're also a hackaday project, here: https://hackaday.io/project/163679-luxonis-depthai

Thoughts?

And thanks for making such a cool system! We once did something similar for radio waves, so you could search out transmitters through vibrations on your hand. And the Commute Guardian bike safety system – our end goal, and the reason we're making DepthAI – will use haptic feedback to the rider as a warning. It's about giving bike commuters 'eyes in the back of their head'.

Cheers,
Brandon & The Luxonis Team!  


Jakob Kilian wrote 09/18/2019 at 12:23

Hi Brandon,

thanks for your nice words!

I just checked your project – it is amazing and super sophisticated. I always thought about adding something like this to my very basic orientation device, but I didn't have the engineering skills to do so... So you are absolutely right, this would be a great combination and I'd be extremely happy to receive one of the boards or to work together with you!

I am super busy right now due to work, other stuff and also (luckily) the Hackaday Prize. I'll send you a PM so that we can stay in contact and have a chat in October at the latest.

Thank you in advance. And props for your great project – we live in very interesting times.

Jakob

Brandon wrote 09/18/2019 at 19:21

Hey thanks for all the kind words!  And yes definitely will keep in touch.  Will keep you posted on hardware and firmware progress.  

Thanks again,

Brandon


Jakob Kilian wrote 02/15/2019 at 12:53

Thanks guys. Just wanted to add: the basic idea behind this is quite old – check out Paul Bach-y-Rita's amazing studies from the 1960s (https://youtu.be/7s1VAVcM8s8?t=147) and plenty of others (keyword: sensory substitution). What is new is 3D ToF technology and cheap hardware in general. This takes everything to a new level...

Josh Cole wrote 02/15/2019 at 06:38

This is such a cool idea! I've been thinking recently about how to use sensory mechanisms beyond eyesight to convey information. This is next level though, and could be life-changing for some people. I can't wait to see/feel where it goes next!


Malek wrote 02/12/2019 at 19:24

Great work, keep going!

