As of 2016, upwards of 1.6 billion people worldwide use smartphones. Smartphones have become a staple piece of technology for many people in North America and around the world. Smartphones and touchscreen devices enable users to navigate their cities, communicate with others more freely, and run applications that promote happy and productive lives. Currently, few practical devices exist that allow quadriplegics to use touchscreen devices - this is where we step in.
The LipSync is an electronic device that allows quadriplegics to use compatible touchscreen and computer devices without the use of their hands. The user manipulates a cursor on the device screen using a mouth-operated joystick with integrated sip and puff controls, which simulate a "tap" and a press of the back button, respectively. Longer sips and longer puffs enable additional secondary features, including "tap and drag", "long tap and drag" and the possibility of more specialized functions as per the user's needs.
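The mapping above (puff for tap, sip for back, with longer events triggering the drag modes) can be sketched as a simple classifier over pressure events. The thresholds, timings, and the assignment of long sip versus long puff to the two drag modes below are illustrative assumptions, not values from the actual firmware, which is published in the project's open-source microcontroller code.

```cpp
// Sketch of sip/puff event classification, assuming pressure deltas are
// reported relative to ambient (puff = positive, sip = negative).
enum class Action { None, Tap, Back, TapAndDrag, LongTapAndDrag };

Action classifyPressure(float delta, unsigned long durationMs) {
    const float kThreshold = 0.5f;      // assumed dead band to ignore noise
    const unsigned long kLongMs = 750;  // assumed short/long boundary

    if (delta > kThreshold)             // puff
        return durationMs < kLongMs ? Action::Tap : Action::TapAndDrag;
    if (delta < -kThreshold)            // sip
        return durationMs < kLongMs ? Action::Back : Action::LongTapAndDrag;
    return Action::None;                // within dead band: no action
}
```

In a firmware loop, the resulting `Action` would then be translated into the corresponding USB or Bluetooth mouse events sent to the host device.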
The LipSync is designed specifically for portable devices and does not require AC power, but it works with any device that supports mice through a universal serial bus (USB) or Bluetooth connection, including desktop and laptop computers.
The LipSync is an open-source hardware project where all of our 3D printer files, component lists and microcontroller code are made public. In the spirit of accessibility, our housing can be 3D printed, the electronic components are readily available and the assembly is as straightforward as possible.
The LipSync was envisioned as a holistic solution that takes into consideration not only the interface but also how the system is mounted on the user’s wheelchair; the electronics of the device are only one part of the implementation. There is no standardized way in which wheelchairs are designed. Wheelchair manufacturers may use round or square tubing, and often use tubing that is incompatible with other manufacturers' so that customers must buy accessories from them. As a result, there is no standard location or clamping mechanism for mounting assistive technology on the chair.
Wheelchairs are also customized to the user, including the height, width and seating position. The seating is customized to minimize the incidence of pressure sores. As a result, the mounting system for assistive technology such as the LipSync also has to be customized. In addition to the instructions for assembling a LipSync, we have included instructions on how to mount one. A combination of off-the-shelf and custom 3D printed components is provided to help makers create a fully integrated and customized solution for the user.
Now the maker community and disability community can meet, collaborate and work together constructing a LipSync over a period not much longer than a weekend. We hope these new relationships will continue spurring innovation within the maker community.
There are three main aspects we will address in our project:
- Developing an easy to build, but robust electronics assembly that novice to experienced makers can build;
- Developing a device housing which can be 3D printed by makers either at home or at dedicated facilities;
- Creating mounting options for a variety of wheelchairs with 3D printed parts and/or commercially available components.
SMARTPHONES AND ACCESSIBILITY
Smartphones, by their very nature, are intended to be used while on the go. Traditional assistive technology designed for desktop and laptop computers is not portable, so it cannot be easily applied to smartphones. While single and dual input systems exist for smartphones, they are slow and frustrating for users with a greater range of movement; single and dual input switches are appropriate for users who can only make one or two movements consistently. The LipSync is designed to allow users with more physical capabilities to interact with smartphone interfaces quickly and efficiently. It is intended for users with a good range of motion of their head and the ability to sip and puff with their mouth. These actions allow the user to move a cursor around the screen and select (tap) an object on the screen.
The LipSync takes the approach that the simplest solution is the best solution. It builds on the concept of the Jouse (which we developed in the late 1980s), with the goal of making it portable and unobtrusive so it can be used with smartphones. With the advent of affordable additive manufacturing devices such as 3D printers, it also takes the approach that assistive technology does not have to look industrial or medical in nature.
HISTORICAL BACKGROUND OF MOUTH OPERATED INPUTS
The introduction of graphical user interfaces (GUIs), such as the Windows operating system, into mainstream computer systems in the late 1980s required a dramatic shift in how assistive technology was designed to interact with computers. What was once a straightforward task of entering letters and numbers (under the DOS operating system) now required the user to move a cursor and pick points of interest on the screen in order to interact with the icons on the system. The Neil Squire Society designed one of the first alternative computer interfaces in the late 1980s to respond to that shift. The Jouse was one of the first ways for users with severe mobility impairments to move a cursor around the screen, using a joystick gripped in the user's mouth with a pneumatic switch that allowed them to sip or puff to activate the left and right selection buttons. The original Jouse was not designed to be portable and required a connection to AC power.
Smartphones are similar to desktop and laptop computers in the design of their interface: they require the user to interact with graphical elements on the screen. Most smartphones use touchscreen interfaces that people with fine motor control issues cannot operate; for these users, a mouse-style pointer is a better way to interact with a smartphone. The mouse is still the dominant interface on personal computers, and many alternative pointing devices have been developed for them: derivatives such as joysticks, foot-controlled mice and head-tracking systems exist for the PC. The challenge is that these alternative interfaces are not portable or require extensive hardware to implement.
People with mobility impairments have a range of abilities. Those with mild mobility impairments may just have restricted ranges of motion of their hands and wrists due to pain or weakness. People with moderate mobility impairments may be able to move their arms and legs but lack the fine motor control of their fingers. Users with severe mobility impairments lack the ability to move their arms, wrists and fingers, and so need other ways to interact with devices. Single and dual switch interfaces exist for users with the most severe physical impairments, but they can be slow and frustrating for users with more physical capabilities, such as the ability to move their head.
Feedback we received on the Jouse over the years, in addition to feedback from early prototypes of the LipSync, was integrated into the final design. The LipSync design process included the feedback of end users at critical stages. Features such as cursor speed control independent of the smartphone device (which may have no speed control of its own) and drag modes for the cursor, critical for click-and-drag interactions, have been integrated into the design.
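Because the speed control lives on the device rather than in the smartphone, it can be modelled as a simple device-side scaler applied to the joystick deflection. The structure, level values, and base step size below are illustrative assumptions for the sake of the sketch; the real values live in the LipSync's published firmware.

```cpp
// Minimal sketch of device-side cursor speed control, assuming joystick
// deflection arrives normalized to [-1.0, 1.0] per axis.
struct SpeedControl {
    // Assumed speed multipliers; the user steps through these levels.
    static constexpr float kLevels[5] = {0.25f, 0.5f, 1.0f, 1.5f, 2.0f};
    int level = 2;  // start at the mid-range speed

    void increase() { if (level < 4) ++level; }
    void decrease() { if (level > 0) --level; }

    // Convert a deflection into a cursor step in pixels, entirely on the
    // device, so the host smartphone needs no speed setting of its own.
    int cursorStep(float deflection) const {
        const int kBaseStepPx = 8;  // assumed base step per update
        return static_cast<int>(deflection * kLevels[level] * kBaseStepPx);
    }
};
```

Keeping the scaling in firmware means the same behaviour carries over to any host, whether it is a phone over Bluetooth or a computer over USB.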
FUTURE LIPSYNC PROJECT GOALS
- Oct-Dec 2016: refine designs and usability with user testing
- Jan-Mar 2017: host a “buildathon” where students, makers and volunteer teams will form around people with disabilities to build and mount LipSyncs
- Apr-Dec 2017: support a network of makers to connect directly with people with disabilities, creating 150 LipSyncs throughout the Pacific Northwest
- Jan-Dec 2018: scale the maker-disability matchmaking service across North America to support 1,500 LipSyncs being created at the local level, as well as developing and applying other assistive technologies
In addition, the Neil Squire Society has been working on a service delivery model for the technology and will soon be launching the Makers Making Change initiative, which has the goal of creating one-on-one relationships between makers and people with disabilities. The initiative will provide the infrastructure to connect makers, open assistive technologies, and people with disabilities in their community to make a local impact. Makers will be able to meet people with disabilities in their community and work with them to deliver a solution.
If you want more information on the LipSync, to track our development more closely, or to work with us in creating (or getting) a LipSync, please connect with us. We have a newsletter and request form on our webpage: http://www.neilsquire.ca/lipsync
This is an open source hardware project and adheres to the Open Source Hardware (OSHW) Statement of Principles 1.0 and Open Source Hardware (OSHW) Definition 1.0, found at http://www.oshwa.org
LipSync by Chad Leaman is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Permissions beyond the scope of this license may be available at firstname.lastname@example.org.