
2020 HDP Dream Team: UCPLA

The 2020 HDP Dream Teams are participating in a two-month engineering sprint to address the needs of their nonprofit partners. Follow their journey here.

The Challenge:

Universal Wireless Remote Control:

As a result of the symptoms associated with cerebral palsy and other physical challenges, individuals are accustomed to interfaces like joysticks, touch pads, and remotes with large buttons.

Teams are tasked with designing a new type of universal remote that meets the needs of the physical challenges described, and integrates with modern electronics like smart TVs, workstations, and digital devices in as few steps as possible.

The Team:

Ruben Kackstaetter
Embedded Software Engineer, with background in Electronics
Frederick, Colorado, USA

Nataliya Kosmyna
Ph.D in Computer Science, specialization in Brain-Computer Interfaces
MS in AI and HCI
Boston, MA, United States

Kelvin Chow
Mechanical and Biomedical Engineer
Toronto, Canada

THE PROBLEM

Even though assistive technology (big buttons, joysticks, etc.) is meant to help individuals with cerebral palsy interface with digital devices, these devices are still difficult to use and lack utility due to their limited integration with consumer electronics. Commercial assistive technology is expensive and offers few options, meaning generic models are not suited to each individual and their unique circumstances.

THE GOAL

To develop an affordable, inclusive, “universal” assistive technology platform for individuals with cerebral palsy (and other conditions) to help them easily access the digital world. The word “universal” can be viewed from two perspectives: 1) the platform can be used by all individuals; 2) the platform can be used to control any electronic device.

Though these goals are unrealistic in the timeframe of a 2-month project, the specific aim of this project was to lay the groundwork of an open source project that can be developed further to achieve these goals.  

THE DREAM

To close the digital accessibility gap between the general population and people with cerebral palsy. With this project, we want to give individuals with cerebral palsy greater independence and help them make positive contributions to society.

PROJECT MILESTONES

MILESTONE 1: Initial Abstract Proposals (July 17, 2020)

MILESTONE 2: First User Testing Session (August 17, 2020)

About a month into the project, the initial prototypes were shipped to UCPLA for initial testing. Shortly after, we had our first user testing session over Zoom, which we were really grateful for: it allowed us to quickly recognize our project oversights up to that point. A follow-up testing session was conducted a week later to gather further feedback on the initial prototypes.

MILESTONE 3: Final Presentation to UCPLA Panel (September 18, 2020)

MILESTONE 4: Final Prototype Delivery (TBD)

UCPLA Dream Team Concepts - HDP 2020.pptx

Initial Project Proposal Presentation Slides

presentation - 17.07 MB - 09/17/2020 at 18:51


UCPLA Dream Team Abstract Concepts - HDP 2020.docx

Initial Project Proposal -2 page summary

document - 65.00 kB - 09/17/2020 at 18:51


UCPLA Dream Team Final Presentation.pptx

Project Final Presentation Slides

presentation - 18.87 MB - 09/17/2020 at 18:49


BOM Gesture Remote - v1.xlsx

Updated Oct. 18, 2020

sheet - 12.42 kB - 10/18/2020 at 08:07


CAD Files Gesture Remote - v1.zip

Updated Oct. 19, 2020 File types: Solidworks(2016), Onshape, STEP, STL

x-zip-compressed - 3.58 MB - 10/19/2020 at 20:56


View all 6 files

  • Using Gesture Control for Roku

    Kelvin Chow · 09/30/2020 at 07:18 · 0 comments

    In the last log, I was experimenting with integrating Roku into our platform, using an ESP32-based "breadboard keyboard" for Roku remote emulation.  In this log, I am going to discuss replacing that input with the IMU-based gesture control prototype.  I will also detail the hardware updates made to the gesture control remote, which was shipped to UCPLA just a few days ago!

    Hardware Updates to the Gesture Control Remote

    Up until now, the gesture control prototype was the stock M5StickC microcontroller.  The limitations that needed to be addressed were 1) increasing battery life, 2) adding a user-friendly switch, and 3) adding a vibrating motor.  These details were discussed in a previous log and developed further here.  One change in direction was made since that log: instead of using a reed switch, I decided to stick with a tactile switch for switching device states.  After reading more thoroughly about the hazards of rare-earth magnets for devices like pacemakers and hearing aids, it was better to play it safe and make sure nothing we give to the user is dangerous.

    Below are some images of the electronic components as well as the 3D printed enclosure (more detailed documentation is being prepared and will be uploaded soon).  

    (a) resistor (b) tactile switch (c) 1 Ah LiPo battery (d) diode (e) vibrating motor (f) protoboard (g) MOSFET (h) M5StickC PCB
    All electronics were soldered to the green protoboard before mounting on the orange M5StickC PCB.  The M5StickC PCB had a few header pins sticking out from one side that made it easy to mount in a compact space (see previous log).
    Device interior showing the new PCB mounted within the 3D printed enclosure.
    Device exterior showing the large, easy-to-press button. The device on the right is a "dongle", which serves a few purposes.  First, it receives the signals from the gesture remote and processes them before relaying commands to other IoT devices.  Second, it is used to calibrate the gesture remote for each user.  Third, it displays some information on the small LED screen, including the state of the device (top 3x3 array) and a battery indicator (bottom row).

    Software Problems: ESP-NOW and MQTT Don't Work Simultaneously

    While transitioning from the breadboard keyboard to the gesture control remote for using a Roku, I ran into the biggest technical issue of this project.  With the breadboard keyboard, a button press would publish an MQTT message, which would be received by the Raspberry Pi running Home Assistant.  However, when I tried to integrate the gesture remote with MQTT, the ESP-NOW communication between the gesture remote and the dongle stopped working.  I spent quite some time debugging this and looking around for answers, but couldn't get it to work.  I also tried replacing one of the communication methods with Bluetooth or painlessMesh, but wasn't happy with either approach because messages were sometimes missed.

    In the end, I settled on a workaround: an ESP32-based board plugged into a USB port of the Raspberry Pi.  This board communicates with the dongle through ESP-NOW, waiting on a message, which is then relayed to the Raspberry Pi through serial communication to trigger a home automation action.

    Raspberry Pi/Home Assistant Hub workaround.  Adding an additional ESP32-based board to the platform, physically connected to the Raspberry Pi.  This grey device is programmed to receive messages via ESP-NOW from the dongle and sends messages via serial communication to the Raspberry Pi.  

    Looking at this problem again, the question comes up: why is the dongle necessary?  Why can't the gesture remote handle the signal processing and send MQTT messages directly to the Raspberry Pi?  There were two initial...


  • Stepping into the Digital World: Roku Remote Emulation

    Kelvin Chow · 09/06/2020 at 06:48 · 0 comments

    Previous project developments have focused heavily on physical hardware prototyping of a keyboard and a motion-based remote controller.  One of the tasks that hasn't been explored in depth is the peripheral digital devices we can control with our prototypes.  We have discussed some ways to control a Roku media player as a proof-of-concept starting point, and in this log I will detail one method using an ESP32 microcontroller and Home Assistant to emulate a Roku remote.

    Simple ESP32 Keyboard

    While we work on a more developed keyboard prototype, I created a low-fidelity one for debugging purposes, shown below.  It consists of 8 tactile switches, each connected to an input pin on the controller.  The uploaded sketch publishes a message (UP, DOWN, HOME, etc.) to an MQTT broker, which is used as a trigger to perform an action on the Roku.

    Home Assistant on a Raspberry Pi

    There are many possible ways to communicate from a microcontroller to the Roku.  I chose Home Assistant because it was simple to set up, easy to learn, and has integrations with the majority of smart devices.  This last point is important because it makes it easy to go from controlling a Roku to controlling another device like an Amazon Echo or Google Nest.

    Setup Procedure

    Home Assistant was installed on a Raspberry Pi following the guide on the Home Assistant website.  Afterwards, the Roku and MQTT broker integrations were set up for this specific example.

    Roku and Mosquitto MQTT broker integrations configured, shown on the Home Assistant dashboard.

    The next step was to establish a connection between Home Assistant and the Roku media player.  Test scripts were written so that a button on the Home Assistant dashboard could be used to send Roku commands.

    Left: Two sample scripts for the up and down Roku commands.  Right: Script execution buttons on the Home Assistant dashboard.

    Once that was done, the ESP32 microcontroller needed to send messages to Home Assistant.  With the MQTT broker configured, short messages were published to HA, which could be seen from the Home Assistant dashboard.  

    Screenshot from the Home Assistant MQTT integration page, listening for button presses on the ESP32 controller and receiving messages (HOME, BACK, SELECT, DOWN).

    Finally, an automation routine was written to take the trigger from the microcontroller and perform actions on the Roku media player.  Two simple automation scripts are shown below.

    Two automation scripts shown.  The first waits for the message LEFT to trigger the action of pressing the left button on the Roku.
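For reference, an automation like the one described above can be sketched in Home Assistant's YAML roughly as follows. The topic, payload, and entity names here are assumptions; they depend on what the ESP32 sketch publishes and how the Roku integration names its entities.

```yaml
# Hypothetical topic and entity names -- adjust to match your setup.
automation:
  - alias: "Roku left on MQTT LEFT"
    trigger:
      - platform: mqtt
        topic: "remote/command"   # topic the ESP32 sketch publishes to
        payload: "LEFT"
    action:
      - service: remote.send_command
        target:
          entity_id: remote.roku  # entity created by the Roku integration
        data:
          command: left           # Roku remote key to press
```

One automation per command keeps each mapping easy to edit from the dashboard, at the cost of some repetition.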

    The video below shows the 8 buttons emulating a Roku remote to control a Roku media player.

  • Upgrading Hardware of IMU Motion Sensing Remote

    Kelvin Chow · 08/26/2020 at 08:32 · 0 comments

    So far, we've been using the M5StickC platform out of the box as the first iteration of our IMU-based motion sensing remote.  At its price point of 10 dollars, it includes a lot of components in a small package, great for wearable device development.  Below, I've summarized the features of the M5StickC, highlighting the components useful for an IMU remote.

    Feature                          Useful?        Notes
    ESP32-PICO                       Yes            Low-power microcontroller
    6-axis IMU (MPU-6886)            Yes            Main feature of this universal remote
    Real-time clock (RTC) (BM8563)   Yes            Important for deep sleep functionality of the ESP32 chip
    Power management unit (AXP192)   Yes
    3 tactile switches               Maybe          Difficult for the user to press these buttons
    IR LED                           Probably not
    Red LED                          Maybe          Can be used as an indicator light (i.e. low battery)
    Header pins                      Maybe          Modularity; add custom features for each user
    Microphone                       No
    LED display (0.96")              No             Small display not suited for this population; high power consumption
    LiPo battery (85 mAh)            Yes/No         Battery capacity needs to be increased to be useful

    The first prototype was helpful in gaining useful insight during our first user testing session.  The second iteration will add features to make the device wireless: a larger battery, haptic feedback, and a non-contact button for turning the device on/off.

    INTEGRATING NEW FEATURES

    My initial thought was to use the header pins and attach a second housing to the original M5StickC platform, as is done with some of the hats on their website.  With this approach, I tried to imagine what the updated device would look like and could not envision a compact design.  I also didn't like the lack of robustness of using the header pins to connect the additional features.

    My next idea was to open up the device and see what the guts looked like.  I ended up breaking some parts including the LED display and one of the PCBs.  Regardless, there's a picture of the main PCB below.  

    M5Stick PCB.  On the topside, we see the small battery connections as well as 10 header pins to a second red PCB for external connections (top left box).  I had to break this PCB in order to detach it from the original housing.  On the backside of the PCB, there are some more components such as the LED display and the IR/red LEDs.   

    This main PCB had all the main components, including the battery connection, so I decided to just mount a second board with all my additional features onto the header pins, similar to how the red PCB was attached.  This implementation is shown at the bottom of this log.

    LARGER BATTERY FOR PRACTICAL APPLICATIONS

    The stock 85 mAh battery allowed a runtime of about 20 minutes, not really useful for any application.  Thus, for the first prototype that we shipped out, we instructed our users to keep the device plugged into a wall adapter or computer.

    I measured the current consumption of just sending IMU signals to a receiver to be about 120 mA.  I hooked up a 400 mAh battery to the M5StickC and got just over 3 hours of runtime, which correlated with the current measurement.  When I set the device to deep sleep mode, it drew about 10 mA, much higher than the ~10 µA deep sleep draw of a bare ESP32 development board; this is something I still need to look into.  I also measured the current while the device was turned off and was surprised to find it drew about 7 mA.  After looking at some online forums, it seems this isn't a measurement mistake but a quirk of the platform.  For this application, where I'm not requiring the device to run for months or years, I think the current consumption in the off/deep sleep states is good enough for now.

    I ended up choosing a 1,000 mAh battery which would extrapolate to about 8 hours of runtime of just sending IMU signals.  However,...


  • First User Testing Session With UCPLA

    Kelvin Chow · 08/23/2020 at 19:48 · 2 comments

    Last week, we had our first user testing session with UCPLA, our first chance to see our prototypes in the hands of our target audience.  We were lucky enough to watch firsthand over Zoom and took away some valuable lessons.  A few screenshots from the session are shown below.

    Walkthrough of our system. Shared screen mirrors the 7" display in front of the user, currently showing the homepage. Two video feeds on the right side - one facing the keyboard/display and one facing the user.
    Testing the keyboard device to navigate through the maze game.
    Testing the wearable IMU input device to navigate through the maze.

    This was the first time our devices were in the hands of other people, and it gave us valuable insight into some considerations we didn't focus on.  Early on, we asked a few questions to help us better understand our audience, but watching someone try our devices allowed us to better grasp some of these issues.  

    1. Designing for the Caregiver: Almost all of our focus was on the user, but there's also a setup procedure involved with our system.  There were a few hiccups during setup where some tasks took longer than necessary.  Some of this could be attributed to difficulties communicating through Zoom, some to doing this trial for the first time, and some to an improper user workflow.  Before we deliver the final prototype, we need to eliminate that third issue and provide proper documentation on how to set up the system.
    2. Visual Impairment (Keyboard Device): We were told early on that about 50% of users didn't have line of sight, but it wasn't easy to understand what that meant; this test gave me a better idea.  We had a display with the home screen near the keyboard, where the buttons on the keyboard spatially corresponded to images on the home screen layout.  My assumption was that the delay in pressing a button was largely due to difficulty controlling one's arm.  For this user, the delay looked like it was due as much to visual impairment as to motor control impairment.  First, the user needed to look at the display, then at the keyboard to recognize which button to press.  Then, the user needed to move their hand and figure out where it was in relation to the target button.  It became a more challenging task than previously thought, and during the maze game with the keyboard device, quite a number of wrong buttons were hit.  This is one user, and I'm not sure how much it applies to others, but it is something to consider for future user testing and design revisions.
    3. Calibration (IMU Device): There were struggles with the IMU-based device as well.  Before using the device, the user needs to go through a calibration sequence.  Selecting the correct calibration positions was one challenge; remembering those positions during use was another.  During usage, the controller had a difficult time distinguishing between the left and right positions, while the up and down positions gave no issues.  This issue came from choosing the wrong calibration positions, and we need to figure out which calibration sequences make it easier to accurately select different buttons.  A lot of these issues are to be expected, because this is a foreign device these users have never tried before and a learning curve is necessary.  What was most encouraging was seeing how engaged the user was while using the IMU device.  By tying physical movements to a game, there was positive reinforcement to keep trying even though a lot of mistakes were being made.  During this session, physical therapy was brought up as a potential use for this system.

  • One Big Button + Joystick/Keyboard Updates

    Kelvin Chow · 08/17/2020 at 08:46 · 0 comments

    The last couple of weeks were busy preparing for prototype shipment, and I forgot to post log updates, so this entry will cover a few topics.  While this upcoming week is focused on user testing and receiving feedback on our keyboard and joystick prototypes, this log will focus on the development of an intermediate prototype: a wireless Bluetooth button.

    STARTING SIMPLE WITH ONE BUTTON

    For this project, we are focusing on delivering two prototypes: the keyboard and the IMU-based joystick.  I wanted to add an intermediate milestone of creating this wireless button prototype, which could potentially be a third deliverable, but more importantly serves as a learning exercise for future development of other prototypes.  Below, I describe 5 discussion topics regarding this one button.

    1) Cost of Existing Big Buttons for Assistive Technology

    A lot of buttons and switches marketed as "assistive technology" are, in my opinion, really expensive.  A big red button can cost about 65 dollars.  The mechanical keyboard switch we are using for our keyboard input device is more than an order of magnitude cheaper at $1.06.  Going one magnitude further, a tactile switch costs $0.10.  From an electrical/hardware perspective, they are all just momentary push button switches.  A previous Hackaday project, the Clunke Button, addresses this issue by creating a DIY assistive button from a keyboard switch, an audio cable, and some 3D printed parts.

    Wireless Bluetooth "assistive technology" buttons are also expensive, with one model priced at $195.  I looked at that button and challenged myself: could I make one with only $19.50 to spend on parts?  Below is the list of electronics from Digikey to make a wireless Bluetooth button, with the subtotal coming in at $19.36.

    Qty   Part #         Description       Unit Price   Extended Price
    1     1965-1000-ND   ESP32 Board       $10.00       $10.00
    2     CH197-ND       Keyboard Switch   $1.06        $2.12
    4     SY628-ND       AA Batteries      $0.33        $1.30
    1     36-2478-ND     Battery Holder    $1.79        $1.79
    1     EG5619-ND      Rocker Switch     $0.57        $0.57
    1     CP1-3513-ND    Stereo Jack       $1.42        $1.42
    1     1175-1491-ND   Stereo Cable      $2.16        $2.16

    With these parts, 3D printed parts, and some wire and solder, a Bluetooth button with a stereo jack for connecting another "Clunke" style DIY button can be made at almost 1/10th the cost of a commercial alternative, and 1/3rd the price of just a wired button.  Below is a snapshot of the CAD model with a section cut out, and an image of trying to fit all the electronics inside the 80x65x52.5 mm housing.

    CAD model of button and housing. The rocker switch turns on battery mode. 3 audio jacks included (10 ports available in total) for connecting external large switches.  Access to the USB connection to upload new firmware or to run the button off wall power.  Most of the housing weight and size is taken up by the batteries, which could probably be reduced.
    Opened housing with the battery compartment taken out. Not a lot of spare room if 10 audio jacks are used. I made a mistake in this model, and due to some interference I can only install up to 6 jacks, 3 on each side.

    So with $20 worth of electronic components, we can make a $200 button with Bluetooth connectivity for the platform we're developing.  This leads to the next topic: why have just a simple button instead of the other devices we're developing?

    2) Asking users to deviate from their unique behaviours and tendencies

    The keyboard and IMU-based input devices we're developing aren't commonplace in the assistive technology market.  Even if we think these devices have added functionality or an improved user experience, that doesn't mean they'll be adopted by potential users.  My guess is that there will be a large population...


  • Package Delivered

    Ruben · 08/15/2020 at 01:42 · 0 comments

    Aragna From UCPLA Has Received The Package

    We had a meeting to go over all of the content and have Aragna set everything up to be sure it was all working. While we did have everything he needed in the box, I soon discovered that I should have labeled all of the cables.

    Our goal for the final version is to eliminate all of the cables.

    Another lesson learned: I should have double-checked the address before shipping. At second glance I would have noticed that the address was missing some information (apt #). Even so, we lucked out and the package was still delivered in time for our meeting.

    Also, I need to do better at taking screenshots during a session.

    Trying to Demo over a Web Meeting 

    Other than the slight confusion over which cables to use where (the joysticks came with three USB-C cables), the demo went well. One challenge when showing a demo is trying to align the camera to give your viewers a good view of the items you are using.

    One way we worked around that was by having me screen-share my remote connection to the display unit. That way the team could see what the device was doing as Aragna tested it.

    Overall, I'd say it was a success!

    We already have a meeting lined up with our first user this Monday.

  • Concept Prototype Ready To Ship

    Ruben · 08/10/2020 at 20:28 · 0 comments

    M5StickCs and Atom ESP32 as cordless joysticks

    At the moment, these will be used to gather more user data to help design a fully wireless joystick.


    Raspberry Pi Computer with display running our Demo Software

    Running our Universal Remote Demo Software to show how we plan to assign many actions to only a few buttons. The UI will match the button layout, and each button on the custom keyboard will correspond to a button in the UI. (To test this with a keyboard, use the following keys.)

    Keyboard Keys

    Custom Adaptive Keyboard Test Kit

    It is simply a custom keyboard with only 14 keys. The design is about giving our potential users a functional test kit so they can feel the difference between the keys and decide which button type, size, trigger force, and recess level they prefer. We also modified two of the buttons by adding a much stiffer spring. (I didn't get the chance to measure the actual force, so the values are guesstimates.)

    Switch Key

    Cherry MX Switch Table

    Switch        Actuation Force   Linear   Tactile   Clicky   Used For
    Red           45g               Yes      No        No       Gaming
    Brown         45-55g            No       Yes       No       Typing/Gaming
    Blue          50-60g            No       Yes       Yes      Typing
    Black         60g               Yes      No        No       Gaming
    Green         70-80g            No       Yes       Yes      Space bar
    White         75-80g            No       Yes       Yes      Typing
    Grey          80g               No       Yes       No       Typing/Gaming
    Linear Grey   80g               Yes      No        No       Gaming

  • Keyboard Matrix Scanning

    Ruben · 08/06/2020 at 18:09 · 1 comment

    Early in my college years, I came across a very useful web-based animated circuit simulator, falstad.com/circuit. Since I'm a hands-on and visual learner, this tool really helped me grasp the behavior of electronic and logic circuits.

    While I was working on the firmware for our concept prototype keyboard, I thought it would be fun to create an animation of how keyboard matrix scanning works.

    Click on the image or visit falstad.com/circuit to try it out for yourself.

    As you can see, the microcontroller drives the rows low one at a time and reads the columns as inputs. If a column reads low while its row is driven low, a key press is detected at that row/column intersection.

    This allows us to read key presses without needing a dedicated input for every single key! In this example, we are able to detect presses for 15 keys with only 8 GPIO pins. Almost all keyboards and TV remote controls follow the same principle.

    You can also check out our Arduino code for the keyboard test-kit at the Github project.

  • IMU Joysticks V2: Calibration Routines and Flag Semaphore

    Kelvin Chow · 08/02/2020 at 23:02 · 1 comment

    In the first log of this project, I experimented with IMU sensors as alternative input methods to large buttons and mechanical joysticks.  Along with the smart remote prototype we are working on delivering to UCPLA for testing, we want to also deliver an IMU-based input controller as well.  In this log, I will discuss the software updates making the joystick more robust and usable for different users.  

    Previous Iteration

    In the first iteration, the code was quickly written to barely demonstrate the concept.  For example, raw IMU signals (accelerometer only) were used, and threshold values were hardcoded to how I wanted to use it.  Three improvements were made.  First, the raw IMU signals were processed to smooth out noise and give better results.  Second, a calibration routine was programmed and a new thresholding technique was used to detect different position states.  Last, a two-joystick controller was shown to demonstrate modularity and the ability to expand input capabilities.

    The hardware did not change from the previous iteration.  All of the processing is done on a receiving unit, an ESP32-based board.  Two M5StickC units (each with an IMU sensor, button, and screen) send signals to the receiver.  In this iteration, one M5StickC unit is intended to be worn by the user and the other is intended for the caregiver, for handling calibration of the system.

    Lightweight Signal Processing for Smoothing Data

    The IMU in the M5StickC is a 6-DOF IMU, with 3 accelerometer signals and 3 gyroscope signals.  With this IMU, all I am trying to do is measure the tilt of the unit.  The first iteration measured tilt using only accelerometer signals, which is not the most accurate approach: any sort of movement, including vibration, produces noisy signals.

    Tilt can also be measured from the gyroscope, whose signals measure angular velocity; orientation is calculated by integrating them.  The issue with this approach is that the angle drifts from the correct orientation over time.

    The most common technique to calculate tilt angles with IMUs is a Kalman filter combining accelerometer and gyroscope signals.  However, I opted for a complementary filter, which is dead simple to implement and doesn't need much computation.  It blends both methods to mitigate both noisy signals and gyro drift.  More detailed info on complementary filters and IMUs in general can be found here; I followed it to implement this system.

    The image above compares the three methods to measure angles based on accelerometer and gyroscope data from one of the IMUs.

    Adding a Calibration Routine 

    The next necessary improvement was a calibration routine to make the system more robust.  Considering that each user with cerebral palsy has different limitations, they won't all be able to use this input device the same way; a calibration routine is needed to stay within each user's range of motion.  The plan that makes sense to me is to involve a caregiver in setting up the IMU.  The IMU will be strapped onto an extremity with decent range of motion.  The caregiver will then instruct the user to move into different body positions, with each position corresponding to a different button.  The caregiver will have a second M5StickC unit, which gives prompts on its display and has a physical button to press to step through calibration.  Once calibrated, the information is written to the flash memory of the receiving microcontroller and stored until the IMU is re-calibrated.

    Once calibrated, the IMU is ready to use immediately. ...


  • Smart Remote Button Sensitivity

    Kelvin Chow · 07/28/2020 at 07:32 · 3 comments

    As Ruben alluded to in a previous log entry, one of the first proof-of-concept prototypes we want to deliver for testing is a smart remote with an array of 15 buttons.  Depending on how a user hits a button, a parameter we want to play with is how much force or effort it takes to press one.  Typing on a keyboard, I have pretty good control of all my fingers and won't mistakenly hit the wrong keys; in my case, I would want something with low resistance and low force.  If we are designing for somebody with less motor control, mistakenly hitting wrong buttons (false positives) is a concern with regular remote control designs.  One way to avoid this is to change the button sensitivity, making the buttons harder to press.

    The main button type we have keyed in on is mechanical keyboard switches.  Their advantages are that they are extremely common, come in a standard size, can withstand millions of cycles, and have a relatively low profile compared to the other options we considered (arcade buttons, limit switches, etc.).  We also thought about making our own switches if we wanted a lower-profile button, but at least for this first prototype it's not the highest priority.

    For a mechanical switch where the button is either on or off, how do we change sensitivity?  If we had an analog "button" like a force sensor, we could change the sensitivity in software by simply adjusting the pressure threshold.  For a mechanical switch, the way to do this is to change the spring stiffness inside the key.  Cherry MX sells many types of key switches with varying parameters such as actuation force (sensitivity) and key travel distance.  We ordered 9 different types of switches for our test panel to vary some of these parameters, including button sensitivity.  However, after thinking about it some more, the actuation force of the different key switches really doesn't vary much: from their website, all of the keys' actuation forces lie within the range of 45-80 gF.  Even though we ordered a bunch of different keys, the actuation force didn't span a broad enough range.  We decided to first open up a key switch to see how feasible it was to replace the spring with a stiffer one.

Disassembled keyboard key switch. Components from left to right: 1) bottom housing with electrical contact pins, 2) spring, 3) actuating component and keycap mount, 4) top housing.

It turns out these switches are really easy to open and simple to disassemble, with only 4 components, one being the spring.  By swapping in a spring with a different stiffness or length, the hope is that we can make switches that are much stiffer than the original, ideally by close to an order of magnitude.  As a quick test, two springs were pulled out of ballpoint pens.  Below is an image of the two longer, stiffer pen springs next to the shorter spring from the key switch.

    Top spring is the original keyboard spring and the bottom two are from ballpoint pens. Factors affecting spring stiffness include wire diameter, coil diameter, and number of turns.
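For reference, those three factors combine in the textbook rate formula for a helical compression spring, k = G·d⁴ / (8·D³·n).  The sketch below is illustrative Python with made-up dimensions (we have not measured the actual spring), but it shows why swapping springs moves the force so much: the rate scales with the fourth power of the wire diameter and the inverse cube of the coil diameter.

```python
def spring_rate(wire_d_m, coil_d_m, active_coils, shear_modulus_pa=79e9):
    """Rate k (N/m) of a helical compression spring: k = G*d^4 / (8*D^3*n).

    G defaults to ~79 GPa, a typical value for music-wire steel.
    """
    return shear_modulus_pa * wire_d_m**4 / (8 * coil_d_m**3 * active_coils)

# Made-up dimensions in the rough range of a key-switch spring:
# 0.3 mm wire, 4 mm mean coil diameter, 10 active coils.
k = spring_rate(0.3e-3, 4.0e-3, 10)      # roughly 125 N/m

# Force at 2 mm of travel, converted from newtons to gram-force.
force_gf = k * 2e-3 / 9.81 * 1000        # a few tens of gF, before preload
```

Doubling the wire diameter alone would multiply the rate by 16, which is why a pen spring with thicker wire can plausibly get us the order-of-magnitude jump we are after.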

Each spring was installed inside the key switch in turn to measure the actuation force it required.  To estimate this force, a kitchen scale was used.  The first test, with the original short spring that came with the key switch, required about 50 gF to actuate.  The rated force from the website is 45 gF, so this was reasonable validation of the kitchen scale as a force-measurement tool.

    Makeshift actuation force measurement setup with a kitchen scale.

    The next two videos show testing of the springs from the ballpoint pens.  For the longest spring, I had trouble closing the housing, meaning I probably exceeded the fully...


Discussions

Jesse Koester wrote 10/06/2020 at 19:19 point

Out of curiosity where are the watch straps obtained?


Ruben wrote 10/08/2020 at 22:00 point

The watch straps came with the M5 Stick Kit


Jon wrote 07/27/2020 at 13:15 point

Hey guys, looks like you are making great progress! I came across a custom macro keyboard project that might give you some additional ideas.

The original project was posted in German at https://maker-tutorials.com/gamepadmacro-pad-mit-mechanische-cherry-mx-tasten-3d-drucken-arduino-pro-micro/

You can also find the parts for printing at https://www.thingiverse.com/thing:2593692

Good luck!


Ruben wrote 07/28/2020 at 18:05 point

This is Awesome!


Josh Starnes wrote 07/21/2020 at 03:40 point

hello guys, I hope your project is going smoothly. I wanted to reach out and let you know you're welcome to utilize my UCPLA adaptive BLOB controller design. It is meant to be 3D printed, and I am sure that with some positioning of momentary switches you could utilize it if you wanted. Good luck!


Ruben wrote 07/21/2020 at 17:04 point

Thanks for your input, we'll be sure to look into your BLOB controller design at https://hackaday.io/project/172039-ucpla-rgb-mood-expression-light-controller

