A new device that allows a computer to change the user's emotional perception of reality.

All augmented reality devices so far provide an interactive experience of the real-world environment that is enhanced by computer-generated perceptual information. In contrast, the EmotiGlass project explores ways in which a computer can modulate the user’s EMOTIONAL perception of reality. Our project aims to develop the first “Modulated-Emotion Reality” device. EmotiGlass enables completely new applications in the field of augmented reality in which emotional biases can be manipulated by computer applications. Additionally, EmotiGlass has therapeutic applications as an aid to help control stress and anxiety.


By: David Prutchi and Jason Meyers



Does the sight of a dog make you happy or fearful?  Are you comfortable around people of other races?  We commonly think that the emotions aroused by events around us are the results of our biases and prior experiences.  This is very true, but there is much more than meets the eye when it comes to understanding the way in which our brains attach emotional content to our perceptions.

We develop implantable cardiac medical devices for a living, and a few months ago, we came across an interesting journal article that caught our curiosity.  In this paper [Azevedo, et al. 2017], researchers from the University of London and their colleagues showed that presenting an image at different periods of the cardiac cycle would cause changes in the way that subjects would emotionally perceive that image.

The paper reported that racial biases could clearly be modulated by changing the timing at which an image is presented.  In this study, pictures of dark-skinned or light-skinned individuals holding various objects were presented to coincide with either the heart’s contraction (cardiac systole) or relaxation (diastole).  Results showed that if the image was presented during cardiac systole, subjects significantly misidentified innocuous objects as weapons when they were held by dark-skinned people.  Remarkable!

A group from University College London conducted a related study [Gray, et al., 2009] through which they found that sensory processing depends on when stimuli are experienced relative to the heartbeat. Specifically, the perception of pain – which is strongly biased by emotion – varied depending on when during the cardiac cycle noxious stimuli were delivered to the subjects.

The mechanism behind these effects seems to be that the emotional arousal caused by a stimulus depends on signals being received by the brain from the body’s internal blood pressure sensors (baroreceptors).  Simply timing the delivery of stimuli against the cardiac cycle can change the way in which the brain processes stimuli!
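As a minimal sketch of what "timing delivery against the cardiac cycle" means in software, the function below classifies a time point as systolic or diastolic relative to the most recent detected pulse peak. The 300 ms systolic window and the function name are illustrative assumptions, not values or code from the cited studies:

```cpp
#include <cstdint>

// Cardiac phase relative to the most recent detected pulse peak.
enum class CardiacPhase { Systole, Diastole };

// Classify a time point within the cardiac cycle. The 300 ms systolic
// window is an illustrative assumption; a real device would derive the
// window from the measured pulse waveform.
CardiacPhase phaseAt(uint32_t nowMs, uint32_t lastPeakMs, uint32_t periodMs) {
    uint32_t elapsed = (nowMs - lastPeakMs) % periodMs;  // ms since last peak, wrapped
    return (elapsed < 300u) ? CardiacPhase::Systole : CardiacPhase::Diastole;
}
```

A stimulus scheduler would poll this (or set a timer from the last peak) and present the image only when the desired phase is active.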

It’s not only the timing component of a stimulus that can change the emotions that it arouses.  The spatial component also has major effects on the emotional perception of visual scenes.  In his exciting book “Of Two Minds: The Revolutionary Science of Dual-Brain Psychology”, Dr. Fredric Schiffer [1998], a Professor of Psychiatry at Harvard Medical School and Attending Psychiatrist at McLean Hospital, showed that very distinct emotions can be evoked by selectively blocking access of the visual field to one of the brain’s hemispheres.

Spatio-temporal changes in the visual field also cause deep changes in emotional response.  EMDR (Eye Movement Desensitization and Reprocessing) is a known psychotherapy technique in which inducing side-to-side eye movements is believed to unlock a memory mechanism that can be used to reprocess traumatic events.  Outside the therapeutic setting, mild side-to-side sweeping of the visual field appears to decrease the emotional impact of distressing events. The objective of the EmotiGlass project is to develop a pair of active glasses that can be controlled to selectively...


EmotiGlass Breadboard Schematic Rev1.1.pdf

Schematic for EmotiGlass Breadboard

Adobe Portable Document Format - 116.79 kB - 08/27/2018 at 05:50


EmotiGlass Prototype Schematic Rev1.pdf

Schematic for EmotiGlass wearable prototype

Adobe Portable Document Format - 99.48 kB - 08/27/2018 at 05:50



Arduino sketch to run EmotiGlass breadboard prototype

ino - 21.32 kB - 08/22/2018 at 20:03



Bluefruit configuration code (from Adafruit)

h - 3.24 kB - 08/22/2018 at 20:03



Packet Parser for Adafruit Bluefruit

cpp - 3.72 kB - 08/22/2018 at 20:03


  • 2 × DS1, DS2 DOGM128E-6 High-contrast, 128x64 pixel LCD supertwist display
  • 2 × Left/Right Liquid-Crystal Shutters Small Liquid Crystal Light Valve - Controllable Shutter Glass, Adafruit Product number 3627
  • 1 × R6 22k
  • 1 × R7 100k trimmer
  • 4 × R1 - R4 10k trimmer potentiometers with shaft

(15 components total)

  • Wearable Prototype Progress: First Print

    Jason Meyers, 09/17/2018 at 23:09

    The CAD model for the wearable prototype has progressed, and a first rev of the main frame piece has been printed.  The frame is being designed with the front and two sides as separate parts.  This allows the front piece, which holds the glass LCD displays and shutters, to be printed from a stiff plastic (PLA for now, but I may eventually switch to a CPE-family material so I don't have to worry about leaving the print in a hot car), letting it give the glass the most protection possible.  At the same time, the sides can be printed from a more flexible Nylon-family material, which should allow the EmotiGlass prototype to fit (hopefully) comfortably on heads of different sizes.

    As soon as I completed the model for the front frame piece, I wanted to print it because I was expecting to need to make some adjustments to the fit.  This part takes about 11 hours using a 0.4 mm nozzle on an Ultimaker 2+.  Sticking to this smaller nozzle makes it possible to remove the support material without too much hassle.  After clearing the support and a few minutes of filing, I was able to test the fit of the components:

    The fit on the display and shutter came out very well for a first test, needing adjustments of less than a millimeter.  The fit on a face, on the other hand, wasn't as good.  We tried holding the part on a few different faces and concluded that the bridge needed to move down significantly.  Additionally, I've flattened out the area above the bridge in the hope that it may improve comfort:

    As the second rev prints, focus shifts back to the PCB layouts.  The initial outlines are transferred from this model into Altium for PCB layout to see how everything fits.  There is room to extend the PCBs downward somewhat, so they'll be adjusted as needed.  The PCB shapes are shown in purple in the following image.  Two screws (not shown in the model) will secure the boards; the holes are sized to accept either imperial or metric hardware, i.e. a standard (cheap) #2 x 1/4" or M2 x 6 machine or sheet-metal screw.

    With the front CAD close to done, focus will shift back to the electronics.  I'll swing back to the CAD model and finish the sides after the boards are ordered.

  • Wearable Prototype Schematics and Simplifying the Hardware Design

    Jason Meyers, 08/27/2018 at 06:18

    The schematics for the wearable prototype have been uploaded, with each sheet corresponding to a small PCB carrying a portion of the EmotiGlass circuitry.  Sheet 1 shows the right side PCB, which makes the connections between the Feather and the rest of the system.  Sheets 2 and 3 are the front PCBs, which mount to the front LCD displays.  They contain the support components necessary for the LCD displays, connections for the side LCD shutters, and connections to other parts of the system.  Sheet 4 shows the development PCB, which contains the control pots and hardware PWM circuitry.  This board will be mounted on top of the Feather, which is in turn mounted on top of the right side PCB.

    Currently, the side LCD shutters are driven by a dedicated hardware PWM circuit.  DC will degrade the shutters over time, so they must be driven by a circuit with no net DC.  A standard PWM peripheral is not able to do this, but it appears that the TCC modules (Timer/Counter for Control) in the Atmel SAMD21 processor (the ARM Cortex M0+ based microcontroller used by this Feather) will be able to generate balanced PWM drive signals with no external circuitry.  The firmware module for this will be written while we are waiting for the PCBs to be manufactured.
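    As a minimal software model of why such a drive is DC-free (this is an illustrative sketch, not the TCC register configuration; the amplitude and half-period parameters are arbitrary), a polarity-alternating waveform sums to zero over any whole number of cycles:

```cpp
#include <numeric>
#include <vector>

// Software model of a DC-balanced shutter drive: the polarity alternates
// every half-period, so the voltage across the LC cell averages to zero.
// amplitude and halfPeriodTicks are arbitrary illustration parameters.
std::vector<int> balancedDrive(int amplitude, int halfPeriodTicks, int cycles) {
    std::vector<int> samples;
    for (int c = 0; c < cycles; ++c) {
        for (int t = 0; t < halfPeriodTicks; ++t) samples.push_back(+amplitude);  // positive half
        for (int t = 0; t < halfPeriodTicks; ++t) samples.push_back(-amplitude);  // negative half
    }
    return samples;
}

// Net DC across the cell: zero for any whole number of cycles.
int netDc(const std::vector<int>& samples) {
    return std::accumulate(samples.begin(), samples.end(), 0);
}
```

    On the real hardware, the TCC would generate the complementary pair directly; this model just captures the zero-net-DC invariant the firmware must preserve.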

    The control pots are present primarily for convenience during prototyping.  As development advances, their function will be replaced eventually by software.  For this reason, the pots and the hardware PWM circuitry have been placed on a separate board.  This board can be installed for initial work with the wearable prototype, and then removed once the firmware and software have advanced sufficiently.

    The completed wearable prototype schematics allow work on the PCB layouts to proceed together with the mechanical design of the frame, which is currently partially complete.

  • Short videos of EmotiGlass breadboard prototype in operation

    David Prutchi, 08/23/2018 at 21:00

    Following are three short videos showing the basic functionality of the EmotiGlass breadboard prototype.

    We added some backlighting (using a strip of EL material) to show the way in which the device causes selective occlusion.  The user's eyes would be on the side of the EL material looking at the scene through the liquid-crystal panels.

    First is the breadboard operating in Baroreceptor-Synchronized Occlusion mode:

    The pulse sensor is on Jason's finger.  We can control the occlusion pattern as well as the timing of the occlusion versus the plethysmographic signal.

    Next is the breadboard being controlled to produce a window for lateralized brain stimulation (we are showing functionality of controls, rather than actual occlusion that would be used for lateralized stimulation):

    Lastly, the sunglasses mode (David casts a shadow on the light sensor):

  • Wearable Prototype Plan and Mechanical Design Start

    Jason Meyers, 08/22/2018 at 22:58

    With the breadboard operational, the next step is to develop a wearable prototype.  Our plan is to 3D-print a frame with a shape similar to traditional glasses which will hold all of the components while being (reasonably) comfortable to wear.  The LCD displays and shutters will fit into grooves in the frame which will capture them on 3 sides.  The front LCD displays will each be mounted to a small PCB which will then attach to the frame with 2 screws, securing those displays in place.  The side LCD shutters will be retained by a small bracket.  A carrier PCB for the Feather PCB will be mounted above the right ear, and the battery will be mounted above the left ear. 

    Before modeling the frame, it was necessary to develop CAD models of the major components.  The LCD display and LCD shutter models were built using the datasheets (where possible) and plenty of measurements of the actual components.  The board outline for the Feather was exported from Eagle, and blocks representing keepout zones for the larger board components were added from measurements. 

    A 3D guide sketch for the glasses was developed based on measurements of my face and of a pair of cheap sunglasses which were a giveaway at last year’s Maker Faire.  This is only a starting point, as it will probably take a few iterations to make the shape and size feel comfortable.  The major components were arranged on the guide sketch so that modeling of the frame could begin.

  • Breadboard Schematic Uploaded

    Jason Meyers, 08/22/2018 at 22:01

    The complete schematic for the EmotiGlass Breadboard has been captured and added to the project files.

  • EmotiGlass Breadboard Prototype Code v9.0

    David Prutchi, 08/22/2018 at 20:01

    The code is running (but may still contain some bugs and leftover debugging/engineering lines), and control is via BLE using the Bluefruit Control Pad running on the iOS Bluefruit app.

    Here is the prototype Arduino code to run the EmotiGlass breadboard:

    /*  ------------------------------------------------------
        (c) 2018, David Prutchi and Jason Meyers
        Licensed under the MIT License
        Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
        documentation files (the "Software"), to deal in the Software without restriction, including without limitation
        the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software,
        and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
        The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

        Runs on Adafruit Feather M0 Bluefruit.
        Uses two DOGM128E-6 LCD displays made by "Electronic Assembly" and two small liquid-crystal shutters
        driven by two MCP4725 digital-to-analog converters through a DC-AC converter.
        Control is via the Bluefruit Control Pad running on the iOS Bluefruit app.
        This version implements:
          Mode 1 = Lateralized occlusion
          Mode 2 = Cardiac-synchronized lateralized occlusion
          Mode 3 = Automatic light-controlled sunglasses
        v.9 adds control of two MCP4725A1 DACs to control contrast of side shutters.
        ------------------------------------------------------  */

    #include <Arduino.h>
    #include <string.h>
    #include <SPI.h>

    // DOGM128E-6 LCD display
    #include <dog_7565R.h>  // Library for DOGM128E-6 displays (dog_7565R is the LCD controller)

    // Adafruit BLE
    #include "Adafruit_BLE.h"
    #include "Adafruit_BluefruitLE_SPI.h"
    #include "Adafruit_BluefruitLE_UART.h"
    #include "BluefruitConfig.h"

    // Adafruit MCP4725 digital-to-analog converters
    #include <Wire.h>
    #include <Adafruit_MCP4725.h>

    /*  Pin connections between LCDs and M0 Feather board:
        LCD           Arduino
        ---           ---------------------------------------------
        40 (CS)       Specific to LCD: Right Front 10, Left Front 12
        39 (Reset)    5
        38 (A0)       9
        37 (SCL)      SPI SCK
        36 (SI)       MOSI

        Available functions in the dog_7565R (LCD controller) library:
          void initialize  (byte p_cs, byte p_si, byte p_clk, byte p_a0, byte p_res, byte type);
          void clear       (void);
          void contrast    (byte contr);
          void view        (byte direction);
          void string      (byte column, byte page, const byte *font_adress, const char *str);
          void rectangle   (byte start_column, byte start_page, byte end_column, byte end_page, byte pattern);
          void picture     (byte column, byte page, const byte *pic_adress);

        Use of rectangle:
          Description: Draws a filled rectangle on the screen. The filling is given by the pattern byte.
          Name: void rectangle (byte start_column, byte start_page, byte end_column, byte end_page, byte pattern);
          Vars: start column (0..127 / 0..131), start page (0..7 / 0..3), end column (0..127 / 0..131),
                end page (0..7 / 0..3), pattern byte to fill area  */

    // Define operating mode
    //   Mode 1 =...

  • EmotiGlass Prototype Breadboard Working!

    David Prutchi, 08/22/2018 at 19:39

    We are in the process of cleaning the schematics and software, so we'll upload them a bit later.  For now, here is a picture of the breadboard:

  • Changed light sensor part and added components

    David Prutchi, 08/22/2018 at 19:02

    The light sensor part was changed to the ALS-PT19 Analog Light Sensor Breakout Board (Adafruit 2748).

    The breadboard parts list was expanded to include the passives and components for the H-bridge and oscillator that drive the side liquid-crystal panels with AC.

  • Core components for EmotiGlass Breadboard Prototype

    David Prutchi, 08/22/2018 at 03:16

    A practical implementation of the EmotiGlass concept requires a pixelized light shutter with the following characteristics:

    • Spatial resolution of at least 2 mm
    • Variable opacity and high contrast
    • Response speed in the tens of milliseconds or better

    After researching possible solutions for fully pixelized electro-optical elements, we settled on modular LCDs for the front lenses. We chose the DOGM128E-6 made by Electronic Assembly GmbH (Zeppelinstraße 19, D-82205 Gilching, Germany) because it allows full control of the pattern of visual occlusion, as well as the opacity level of the occluded and non-occluded areas. 

    The DOGM128E-6 is sold by Mouser and Digi-Key (around $23 each).  It is a high-contrast, 128x64 pixel supertwist LCD with a 15 μm dot gap.  It incorporates an ST7565R controller with an SPI interface.  The display module requires just a single 3.0 to 3.3 V power supply (270 μA typical) and can be mounted directly on a PCB.  The LCD is compact (55x46x2 mm) with a large viewing area of 51x31 mm.

    For the side panels we chose the “Small Liquid Crystal Light Valve - Controllable Shutter Glass” distributed by Adafruit as Product number 3627 ($2.95 each).  The viewing window is 31x33 mm, while the panel size is 36x36x2 mm, making it suitable to mount on the sides of goggles based on the DOGM128E-6.  The device is a transmissive twisted nematic panel and requires an AC drive voltage in the 3.0V range.

    To keep the first breadboard prototype of EmotiGlass as open-source as possible, we chose an Arduino IDE-compatible Adafruit Feather M0 Bluefruit LE (Adafruit Product ID: 2995, $29.95) as the controller.  Power will be derived from a 3.7 V 150 mAh lithium-ion polymer battery (Adafruit Product ID: 1317, $5.95) that can be recharged directly from the Feather.

    For the breadboard prototype we chose a ready-made plethysmography sensor (Pulse Sensor Amped, Adafruit Product ID: 1093, $25.00).  An ALS-PT19 Analog Light Sensor Breakout Board (Adafruit Product ID: 2748) is used for the “automatic sunglasses” mode.
    A simple oscillator and analog H-bridge drive the side-panel shutters, with the drive amplitude set by two MCP4725A1 DACs (MCP4725 Breakout Board - 12-Bit DAC w/I2C Interface, Adafruit Product ID: 935, $4.95).
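    As a sketch of how a shutter-opacity setting might map to a 12-bit MCP4725 code (a hypothetical linear mapping for illustration only; the real LC transmission curve is nonlinear and would need calibration, and opacityToDac is not project code):

```cpp
#include <cstdint>

// Hypothetical linear map from a 0-100% opacity request to a 12-bit
// MCP4725 code (0..4095). Real LC panels have a nonlinear voltage-to-
// transmission curve, so a lookup table from calibration would replace this.
uint16_t opacityToDac(float opacityPct) {
    if (opacityPct < 0.f)   opacityPct = 0.f;    // clamp to valid range
    if (opacityPct > 100.f) opacityPct = 100.f;
    return static_cast<uint16_t>(opacityPct / 100.f * 4095.f + 0.5f);  // round to nearest code
}
```

    The resulting code would be written to the DAC over I2C (e.g. with the Adafruit_MCP4725 library's setVoltage call), and the H-bridge converts that DC amplitude into the AC drive the panels require.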

