• Hardware Decisions - Leaving the MUSE

    James P. - 09/18/2016 at 17:44

    The original Typeface demo was built using a commercial EEG reader - the MUSE headband. It's an amazing piece of hardware, and anyone interested in EEG or brain-computer interfaces should definitely check one out if they get the chance. The reader was able to reliably detect blinks and allowed us to get a working demo of our idea running quickly.

    Nonetheless, the device proved imperfect for our purposes for a number of reasons:

    1. Although much cheaper than typical assistive typing interfaces, the MUSE headband still costs $299, and a lot of that cost goes toward functionality we don't need.
    2. Testers with large earrings, odd-shaped heads, or lots of hair had trouble reliably typing during the project demonstration.
    3. The MUSE is a closed source platform, which does not jibe well with our own beliefs in open hardware and also limits our ability to control the data the device reports.
    4. The MUSE really only works on people's foreheads. Our hope is to build a generic platform that could be mounted on any muscle over which a paralyzed individual has retained control.

    As such, we intend to build our own hardware to pair with our software. At present we plan to try out the following technologies:

    1. Replace EEG with EMG - EMG sensing can be vastly more affordable and more reliable than the electrodes involved in EEG. It's also less invasive and will not require the use of electrode pastes. The sort of unary signals we rely on do not require the nuanced differentiation EEG enables, so there's no real reason to inflict the disadvantages and risks of EEG on our users. EMG is also better suited to broadening the use-cases of our platform from just forehead signals to generic anywhere-on-the-body detection (see the sketch after this list).
    2. Experiment with IR instead of Bluetooth and keep signal processing hardware-side. By using an IR transmitter/receiver pair we can greatly decrease the size, power requirements, and expense of the device. Some of our testers also expressed trepidation about wrapping a Bluetooth device around their cerebrum. Given that our platform is built around reducing input complexity to a single signal, IR may be a good fit for our purposes. The usual concerns with IR communication (sunlight, directionality, etc.) matter less here, since the device will be used indoors by a user who necessarily maintains line of sight with the UI.
    3. Look into building a USB receiver that exposes a keyboard interface - allowing users to type into any application instead of just the Typeface communicator.
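
    To make the first two items concrete, here is a minimal sketch of the kind of single-threshold event detection we have in mind. It assumes rectified EMG samples arrive as integers from some acquisition source; the read_sample function, the sampling rate, and the threshold value are all placeholders rather than our final hardware design. The only thing the detector emits is a single "activation" event, which is all the Typeface UI needs.

        import time

        THRESHOLD = 300       # hypothetical amplitude threshold, in raw ADC counts
        REFRACTORY_S = 0.3    # ignore re-triggers for this long after an event

        def read_sample():
            """Placeholder: return one rectified EMG sample from the sensor.

            On real hardware this might read an ADC pin or a serial port;
            it exists here only so the sketch is self-contained.
            """
            return 0

        def detect_events(on_event):
            """Call on_event() once each time the signal crosses THRESHOLD."""
            last_event = 0.0
            armed = True
            while True:
                sample = read_sample()
                now = time.monotonic()
                if armed and sample >= THRESHOLD and now - last_event > REFRACTORY_S:
                    on_event()        # the single unary signal: "the muscle fired"
                    last_event = now
                    armed = False     # wait for the signal to fall before re-arming
                elif sample < THRESHOLD:
                    armed = True
                time.sleep(0.002)     # ~500 Hz polling, purely illustrative

    Because the only thing that leaves the detector is a single event, the same logic works whether the electrodes sit on a forehead, a cheek, or a hand, and whatever link carries it (Bluetooth, IR, or USB) only ever has to transmit "an event happened".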

  • Origins and Interfaces

    James P. - 09/18/2016 at 17:32

    Typeface began, like many projects, as a hackathon entry - in this case, at Georgetown University's first MLH-sanctioned hackathon.

    Originally, the hope was to build a brain-computer interface using EEG to allow users to play a simple spell-casting game we'd dreamed up. However, as we worked with the Muse headband - a commercial EEG product - we realized there was potential to build something more useful than just a game. Instead, our goal became to allow a user to type with as few muscles as possible.

    The biggest challenge here was developing a unary typing interface. With a typical keyboard, there are dozens of potential inputs (keys) that a user selects from as needed. Even in Morse code, one must both maintain consistent pacing and communicate two distinct inputs (short and long signals).

    Eventually, we recalled a story about how prisoners of war in Vietnam managed to communicate with each other using what is known as "tap code" - a simple method of transmitting messages through a series of taps that does not require either party to learn Morse code.

    The way tap codes work is pretty straightforward. Imagine a grid of letters like the one below:

    A B C D E
    F G H I J
    K L M N O
    P Q R S T
    U V W X Y/Z

    To communicate the letter "N", a user would tap four times (the number of columns from the left), pause, and then tap three times (the number of rows from the top). In this way, any message can be spelled out simply by communicating each letter's coordinates on the grid. The system isn't quite as efficient as Morse code, but it's easily learned and reliable.
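
    A few lines of Python make the encoding concrete. The grid literal and the Y/Z-shared cell mirror the table above; the function names are just our illustration of the scheme as described:

        # The tap-code grid as written above; Y and Z share the final cell.
        GRID = [
            "ABCDE",
            "FGHIJ",
            "KLMNO",
            "PQRST",
            "UVWXY",   # "Z" is sent using the same cell as "Y"
        ]

        def encode_letter(letter):
            """Return (column_taps, row_taps) for one letter, column count first."""
            letter = "Y" if letter.upper() == "Z" else letter.upper()
            for row_index, row in enumerate(GRID):
                col_index = row.find(letter)
                if col_index != -1:
                    return (col_index + 1, row_index + 1)   # tap counts are 1-based
            raise ValueError(f"no cell for {letter!r}")

        def encode_message(message):
            return [encode_letter(c) for c in message if c.isalpha()]

        print(encode_letter("N"))     # (4, 3): four taps, pause, three taps
        print(encode_message("HI"))   # [(3, 2), (4, 2)]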

    The Typeface UI was inspired by this protocol. Instead of requiring a user to "group" their signals into counted rows and columns - counts that our hardware could easily get wrong - we use a timer. The timer highlights individual columns on a grid and, when the user blinks, it switches to highlighting rows. When the user blinks again, the character at the intersection of the selected row and column is typed on the screen.
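
    A stripped-down sketch of that scanning loop might look something like the following; the tick length, the highlight calls, and the blink polling are all placeholders for the real UI and sensor code:

        import time

        GRID = [
            "ABCDE",
            "FGHIJ",
            "KLMNO",
            "PQRST",
            "UVWXY",
        ]
        TICK_S = 0.75    # how long each column/row stays highlighted (illustrative)

        def highlight_column(col):
            print(f"highlighting column {col + 1}")    # stand-in for the real UI

        def highlight_row(row):
            print(f"highlighting row {row + 1}")       # stand-in for the real UI

        def blink_during_tick():
            """Placeholder: block for one tick and report whether a blink arrived."""
            time.sleep(TICK_S)
            return False

        def select_character():
            """Scan columns until a blink, then scan rows until a second blink."""
            col = None
            while col is None:                  # phase 1: the timer walks the columns
                for candidate in range(len(GRID[0])):
                    highlight_column(candidate)
                    if blink_during_tick():     # first blink locks in the column
                        col = candidate
                        break
            while True:                         # phase 2: the timer walks the rows
                for row in range(len(GRID)):
                    highlight_row(row)
                    if blink_during_tick():     # second blink picks the character
                        return GRID[row][col]

    Because the timer does the counting, the user never has to produce a precisely paced sequence of signals; a missed blink just means waiting for the highlight to come around again.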

    In this way, one can type nearly any message simply by blinking at the desired times.