• Testing Microinteraction Designs

    nathan.matsuda - 06/03/2016 at 14:54

    One of the core design considerations in this project is the way the LED ring is used to indicate different system states to the user, which in turn influences the requirements for the final hardware.

    Microinteraction design has been discussed at length, particularly for mobile application interfaces, where getting a satisfying, clicky feel out of a touchscreen mostly comes down to tweaking the animation that follows a button press.

    In our case we don't have the open-ended canvas of arbitrary animated graphics, but the NeoPixel ring still allows enough flexibility to test a variety of alternatives.

    On the hardware side, we'll aim for the simplest LED setup that supports the finalized animation. WS2812 LEDs aren't ideal from a power standpoint because the integrated controllers each draw around 2 mA even when the LEDs are off. I could add a transistor to cut power to the ring when it's idle, but there may not be any need for RGB LEDs in the first place. The big benefit of the WS2812s is that they need only a single GPIO pin; with the variety of multiplexing schemes out there for driving plain LEDs from a few pins, the power savings of dropping the WS2812s will probably be worth it, depending on the number of LEDs in the final design. Ultimately the minimum LED setup will depend on the desired animation set.
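
    To make these comparisons easy, the plan is to keep each candidate animation as a small interchangeable function in the NodeMCU script. A minimal sketch of that harness is below; the ring size, data pin, and the exact ws2812.write() signature are assumptions that depend on the firmware build.

    ```lua
    -- Animation test harness sketch for the NeoPixel ring (NodeMCU Lua).
    -- NUM_LEDS, RING_PIN, and the two-argument ws2812.write() call are
    -- assumptions; newer firmware builds use ws2812.init() and a fixed pin.
    local NUM_LEDS = 16
    local RING_PIN = 2

    -- every pixel set to the same color (ws2812 expects GRB byte order)
    local function solid(g, r, b)
      return string.rep(string.char(g, r, b), NUM_LEDS)
    end

    local animations = {
      -- slow triangle-wave "breathing" pulse in blue
      breathe = function(t)
        local phase = t % 64
        local level = phase < 32 and phase * 4 or (63 - phase) * 4
        ws2812.write(RING_PIN, solid(0, 0, level))
      end,
      -- a single dim white pixel chasing around the ring
      chase = function(t)
        local frame = {}
        for i = 1, NUM_LEDS do
          frame[i] = (i == (t % NUM_LEDS) + 1) and string.char(64, 64, 64)
                                                or string.char(0, 0, 0)
        end
        ws2812.write(RING_PIN, table.concat(frame))
      end,
    }

    current = "breathe"   -- global so it can be switched from the serial console
    local tick = 0
    tmr.alarm(1, 50, 1, function()   -- ~20 fps update loop (older tmr API)
      tick = tick + 1
      animations[current](tick)
    end)
    ```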

  • 3D Printing

    nathan.matsuda - 05/30/2016 at 00:27

    Just wrapped up an initial print for the case.

    Aside from a few cutouts internally, I didn't end up needing to do much fettling to get things to fit.

    Next up - wiring up the 2nd unit and blocking in the actual interaction sequence.

  • Temporary Case Design

    nathan.matsuda - 05/27/2016 at 20:29

    Even though this first iteration using off-the-shelf parts is a lot bulkier than it needs to be, we can still test out a scaled-up version of the intended form factor. The 3-piece case consists of a chassis with attachment points for a strap or wristband, a clear diffuser cap, and an oversized button that attaches to the tact switch.

    After test fitting an initial print, these CAD files will go into the repository. Between them and the Adafruit BOM, anyone should be able to reproduce the setup.

  • Hello World

    nathan.matsuda - 05/27/2016 at 20:10

    Following this example, I threw together a quick NodeMCU script that blinks the LEDs and pulses the buzzer when a message is published (on button push) or received. On the PC side, I have Mosquitto running as the message broker, and I'm using mqtt-spy to easily monitor and publish to topics for debugging.
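
    Roughly, the script looks something like the sketch below. The broker address, SSID, topic name, and pin numbers are placeholders, and the calls shown assume the older NodeMCU Lua APIs (integer tmr IDs, two-argument ws2812.write(), string-based wifi.sta.config()); adjust for your firmware build.

    ```lua
    -- Hello-world sketch: blink the ring and pulse the buzzer whenever a
    -- message is published (button push) or received over MQTT.
    -- Broker address, SSID, topic, and pin numbers below are placeholders.
    local BROKER, TOPIC = "192.168.1.10", "callbutton/ping"
    local RING_PIN, BTN_PIN, BUZZ_PIN = 2, 5, 6
    local NUM_LEDS = 16

    local function flash_and_buzz()
      ws2812.write(RING_PIN, string.rep(string.char(32, 32, 32), NUM_LEDS))
      pwm.setup(BUZZ_PIN, 1000, 512)          -- 1 kHz tone at 50% duty
      pwm.start(BUZZ_PIN)
      tmr.alarm(2, 300, 0, function()         -- everything off after 300 ms
        pwm.stop(BUZZ_PIN)
        ws2812.write(RING_PIN, string.rep(string.char(0, 0, 0), NUM_LEDS))
      end)
    end

    local m = mqtt.Client("callbutton-a", 120)
    m:on("message", function(_, topic, data) flash_and_buzz() end)

    wifi.setmode(wifi.STATION)
    wifi.sta.config("my-ssid", "my-password")  -- placeholders

    -- wait for an IP, then connect to the broker and hook up the button
    tmr.alarm(0, 1000, 1, function()
      if wifi.sta.getip() == nil then return end
      tmr.stop(0)
      m:connect(BROKER, 1883, 0, function(client)
        client:subscribe(TOPIC, 0)
        gpio.mode(BTN_PIN, gpio.INT, gpio.PULLUP)
        gpio.trig(BTN_PIN, "down", function()
          client:publish(TOPIC, "ping", 0, 0)  -- we're subscribed to TOPIC, so
        end)                                   -- this also fires flash_and_buzz
      end)
    end)
    ```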

  • Parts Test Fit

    nathan.matsuda - 05/27/2016 at 01:57

    Parts arrived from Adafruit:

    Since there are only a few connections (one pin + GND + battery power for the WS2812 LEDs, one pin + GND for the button, one pin + VCC for the buzzer), I opted to tie everything directly to the header pin holes on the Feather board. With the help of some double-sided tape, this is already the basic device.
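
    In firmware terms the wiring reduces to three pin assignments, something like this (the particular NodeMCU pin indices here are placeholders for whichever Feather pins the parts actually landed on):

    ```lua
    -- Placeholder pin map (NodeMCU pin indices, not raw ESP8266 GPIO numbers)
    RING_PIN = 2   -- WS2812 data in; ring powered from the battery rail
    BTN_PIN  = 5   -- tact switch to GND, using the internal pull-up
    BUZZ_PIN = 6   -- buzzer on a GPIO, driven with PWM for a simple tone

    gpio.mode(BTN_PIN, gpio.INPUT, gpio.PULLUP)
    pwm.setup(BUZZ_PIN, 1000, 0)   -- silent until the duty cycle is raised
    ```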

    I'll be cramming this into a 3D printed shell for now; later versions will consist of a single compact custom PCB.

  • Development Tools

    nathan.matsuda - 05/24/2016 at 15:08

    There is a large ecosystem of IoT development tools by now, which makes it extremely easy to get up and running with hardware like the ESP8266. In this particular case, we will use the following hardware prototyping products from Adafruit:

    Communication between the call buttons and the host computer will take place over the MQTT message-passing protocol. On the ESP8266 side, MQTT is a built-in module in the NodeMCU embedded development environment (MIT license). On the host side, we will use the Mosquitto MQTT broker for message distribution (EPL/EDL license).

  • Initial design considerations

    nathan.matsuda - 05/23/2016 at 23:34

    This project will attempt to address an immediate challenge my wife and I are facing - how to avoid yelling from room to room while we try to take care of our newborn baby and maintain some semblance of continuity in household upkeep. This means we want to build a working solution immediately, with available tools and a minimum amount of effort.

    With this in mind, our strategy is to use (or misuse) widely available, well-documented Internet-of-Things development tools to rapidly develop a non-internet Thing. Using these tools has the added benefit of making it easy to A/B test variations on the basic user interaction and to debug the system remotely. Once the system operation is satisfactory, we can pursue a later design phase with more appropriately targeted hardware.

    Here's a sketch of what we want:

    - User A presses a button to get User B's attention

    - User A's device lights up to indicate that the call has been sent, while User B's device lights up to show that a request has been made

    - User B then presses the button to acknowledge the request

    - Both devices show the acknowledgment

    After initiating a call, User A can also press again to cancel it. The whole sequence is symmetric, so either user can call the other.
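
    As a concrete sketch, the per-device logic should reduce to a small state machine along these lines. The topic names, payloads, and state names below are placeholders rather than a finalized protocol; the MQTT client would subscribe to the other device's topic and call these two handlers from the button interrupt and the message callback.

    ```lua
    -- Illustrative call / acknowledge / cancel state machine (one per device).
    -- Topics, payloads, and state names are placeholders, not a final protocol.
    local ME, THEM = "a", "b"       -- swapped in the other device's config
    local state = "idle"            -- idle | calling | called | acknowledged

    local function show(s)
      -- map each state to an LED pattern; details come out of the animation tests
      print("state: " .. s)
    end

    -- our own button was pressed
    local function on_button(client)
      if state == "idle" then            -- start a call
        state = "calling"
        client:publish("call/" .. ME, "call", 0, 0)
      elseif state == "calling" then     -- pressing again cancels our call
        state = "idle"
        client:publish("call/" .. ME, "cancel", 0, 0)
      elseif state == "called" then      -- acknowledge the other user's call
        state = "acknowledged"
        client:publish("call/" .. ME, "ack", 0, 0)
      elseif state == "acknowledged" then
        state = "idle"                   -- clear the acknowledgment display
      end
      show(state)
    end

    -- a message arrived on the other device's topic ("call/" .. THEM)
    local function on_message(client, topic, data)
      if data == "call" then
        state = "called"
      elseif data == "ack" and state == "calling" then
        state = "acknowledged"
      elseif data == "cancel" then
        state = "idle"
      end
      show(state)
    end
    ```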

    While the final system will likely use some sort of low-power RF connection (such as a LoRa link) directly between the two devices, for now we will implement it on top of an ESP8266/MQTT stack. Here is a rough system diagram:

    This approach carries a fair amount of unnecessary system overhead, but during a rapid prototyping phase that extra firepower is useful. In particular, the need for a WiFi network and a host server looks problematic at first glance, but it lets us push updates to each device seamlessly and monitor and debug message traffic.

    We hope that building the system this way will get us something that works for our immediate needs very quickly (within days, hopefully) while supporting an iterative design process to refine the product.