Concept Design Details

The Challenge:

According to the World Health Organisation, over 5% of the world’s population suffers from disabling hearing loss. That means that over 360 million people are at risk of not being able to communicate effectively with those around them. In turn, the Deaf often struggle to gain access to employment and to interact socially with their peers. Research conducted in Bulawayo, Zimbabwe suggests that children with severe hearing impairments not only have low reading levels and limited vocabulary, but also often feel a sense of isolation and loneliness (Mpofu and Chimhenga 2013). These challenges are also well documented here: http://www.who.int/mediacentre/factsheets/fs300/en/.

Through the application of innovative technologies, I believe that the challenge of effective communication between hearing and Deaf communities can be overcome. That is why I am taking on the challenge of developing and producing a device that can record and translate the hand motions used in South African Sign Language. Because I am from South Africa, I will develop the device with local Deaf communities in mind; however, it will be designed so that it can easily be adapted to other contexts. By translating Sign Language in real time, the device will allow Deaf people to communicate easily with those who do not know Sign Language, and vice versa. The technology can also be used by non-signers to help them learn Sign Language, making it especially valuable for parents and educators of Deaf children.

In order to add value to the everyday lives of both Deaf people and those with whom they interact, this technology will have to be accurate and integrate seamlessly into their daily routines. The intention is also for the device to be worn all day. Given these aims, the device must meet the following criteria:

My Background

At this point the team consists of just myself, Ben McInnes. I am a Mechatronics Engineer living in Cape Town, South Africa, and I completed my BSc in Mechatronics at the University of Cape Town. My final-year project was to develop a digital glove that could recognise gestures to control mobile robots. That project led to my Masters, which was to improve the glove and use it for Sign Language translation. Unfortunately, due to time and budget constraints, I was unable to develop the glove as far as I would have liked, and the Hackaday Prize has inspired me to pick up where I left off and continue development! While my initial design from university was the basis for this idea, the new device will use a completely different method for capturing the hand data. This implementation will also go further than my previous project and perform true Sign Language translation!

The First Attempt

During my studies I designed a glove that used a 9-DoF IMU and flex sensors. The IMU measured the hand's movement and orientation, while the flex sensors (10 in total) measured the flexion of the PIP and DIP finger joints. An Arduino-based MCU collected the captured data and transferred it, via Bluetooth, to a PC, where the data was cleaned and put through a basic neural network that classified it into different gestures.
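As a rough sketch of how that pipeline might look on the PC side, the snippet below scores one frame of sensor data (10 flex readings plus 9 IMU values) with a small feed-forward network. The layer sizes, scaling, gesture count, and function names here are my own illustrative assumptions, not the actual network from the project; in practice the weights would come from training on recorded gesture data rather than being random.

```python
# Illustrative sketch (assumed layer sizes and scaling, not the real model):
# one frame = 10 flex-sensor readings + 9 IMU values, scored by a tiny MLP.
import numpy as np

N_FLEX, N_IMU, N_HIDDEN, N_GESTURES = 10, 9, 16, 4

rng = np.random.default_rng(0)
# Placeholder weights; a trained model would supply these.
W1 = rng.normal(scale=0.1, size=(N_FLEX + N_IMU, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.1, size=(N_HIDDEN, N_GESTURES))
b2 = np.zeros(N_GESTURES)

def classify(flex_raw, imu_scaled):
    """Return a probability distribution over gesture classes for one frame."""
    # Scale 10-bit ADC flex readings to [0, 1]; the IMU values are assumed
    # to arrive already scaled to roughly unit range.
    x = np.concatenate([np.asarray(flex_raw, dtype=float) / 1023.0, imu_scaled])
    h = np.tanh(x @ W1 + b1)            # hidden layer
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max())   # numerically stable softmax
    return e / e.sum()

probs = classify(flex_raw=[512] * N_FLEX, imu_scaled=np.zeros(N_IMU))
```

The cleaned data stream from the glove would be fed frame by frame into a function like `classify`, and the highest-probability class taken as the recognised gesture.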

While the idea and the glove worked well enough, the device was large and at times unwieldy. The flex sensors came with their own set of problems: they have a tolerance of up to 20%, and wear and tear would eventually affect the results. Another issue was that if the glove did not fit the user's hand well, the sensor readings would be affected significantly. Because the glove needed to fit very tightly, it was not easy to put on or take off. These are the sorts of problems that I hope to rectify with my new design.

The New Concept:

The idea is to design and build a device that can translate Sign Language in real time. There have been many attempts at this, but I believe that one of the major flaws in most of them is the glove itself. Gloves can be uncomfortable and ill-fitting, and if made badly they can reduce the effectiveness of the sensors. They can also be annoying to put on and take off.

I believe that by using IMUs we can do away with the glove entirely and have a simpler design that is easy to put on, comfortable, and will not get in your way. The idea is to place an IMU on each finger; these IMUs will sit in rings that fit onto the finger's middle phalanx. The rings will not be complete circles, so they can be "clipped" onto each finger, which I believe would be far quicker than putting on a glove. A sixth IMU will be positioned on the back of the user's hand. This will be the base unit that controls the device, housing the MCU, battery, Bluetooth module, and any additional circuitry such as indicators and buttons. The main board housing will require several iterations to make sure that it meets the criteria described above.
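One detail worth sketching: with an IMU on each finger and a sixth on the back of the hand, the finger orientations are most useful when expressed relative to the base unit, so that the same sign looks the same to the classifier no matter which way the arm is pointing. Assuming each IMU reports its orientation as a unit quaternion (w, x, y, z), that transformation is a quaternion conjugate and multiply. The function names below are my own illustration, not from the project:

```python
# Sketch: express a finger ring's orientation in the frame of the
# back-of-hand base unit. Quaternions are unit (w, x, y, z) arrays.
import numpy as np

def quat_conj(q):
    """Conjugate of a quaternion (inverse, for unit quaternions)."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    """Hamilton product a * b."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def finger_relative_to_hand(q_finger, q_hand):
    """Finger orientation expressed in the base unit's frame."""
    return quat_mul(quat_conj(q_hand), q_finger)
```

Feeding these relative orientations (rather than raw world-frame readings) into the translation model should make the training data far less sensitive to overall hand and arm pose.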

The Deaf Community

While developing this idea I have made contact with some people who have ties with local Deaf communities, one of them being DeafSA. They are eager to see how this project turns out, and I don't plan on disappointing them! With their help, I hope to have access to people who are fluent in South African Sign Language. This will be a huge benefit, as we will need plenty of data to train the models needed for the translation.