Production Horizon

A project log for Perceptoscope

A public viewing device for mixed reality experiences in the form factor of coin-operated binoculars.

bensax · 11/11/2017 at 08:46 · 0 Comments

After a long and exhausting journey of systems integration, testing, and coding, we're ready to put the latest prototypes out into the world and gear up for production early next year. Lots of exciting locations and partnerships are in the works that I'll be sure to update on as the final details come together. Since August, I've also been deep diving into the organizational sustainability and community aspects of the project as a National Arts Strategies Creative Community Fellow.  

In the immediate future, I'm excited to announce you'll be able to experience the latest Perceptoscope in person at the DTLA Mini Maker Faire on Saturday December 2nd. It should be a lot of fun to share what we've been up to with the Maker scene here in LA. 

And now for a quick tour of the final pre-production prototype...

Special thanks to Dan and the rest of the gang at the Design Lab for helping me with this final push on fabrication.

On the systems front, the big change to note is that we've bumped up to a high-DPI 2K resolution display, which proved surprisingly challenging with our original combination of components. After some experimenting, we're now using the NVIDIA Tegra platform to power the project. We can now easily deploy on either ARM or x86 systems, so it really will come down to what makes the most sense for a given deployment in terms of power consumption and cost.

The Tegra TK1 is a beast of an ARM system, and I've been really impressed by the graphics performance for such a relatively inexpensive board. Moving up to the TX1 or TX2 could open up an even more impressive array of possibilities as we integrate additional computer vision and machine learning algorithms into the platform. 

Also worth noting is the electronic shutter, which now gives us easily variable control over the light entering the optics. Until now I had been manually adjusting variable ND filters in front of each eye depending on ambient lighting conditions, which was both tedious and unsustainable for a permanent deployment. Currently, a simple circuit controls the shutter with a potentiometer at the bottom of the viewer, but the plan is to use an additional light sensor to adjust the shutter dynamically as needed.
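The sensor-driven version could be as simple as mapping the ambient-light reading to a shutter drive level. Here's a minimal sketch of that mapping as pure logic; the ADC range, thresholds, and the `shutterLevel` function itself are illustrative placeholders, not the actual Perceptoscope circuit:

```cpp
#include <algorithm>
#include <cstdint>

// Map an ambient-light reading (10-bit ADC counts, 0-1023) to a shutter
// drive level (0 = fully open, 255 = fully closed). The darkFloor and
// brightCeil thresholds are placeholders; real values would be calibrated
// against the actual light sensor and optics.
uint8_t shutterLevel(int ambient, int darkFloor = 100, int brightCeil = 900) {
    // Clamp the reading into the calibrated range.
    ambient = std::max(darkFloor, std::min(ambient, brightCeil));
    // Linearly interpolate between fully open and fully closed.
    return static_cast<uint8_t>(
        (ambient - darkFloor) * 255L / (brightCeil - darkFloor));
}
```

On the device this value would feed a PWM output driving the shutter, with some smoothing so the exposure doesn't pump as clouds pass.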

Buttons and the D-pad on the handles are a combination of Adafruit PiGrrl Zero boards with an MCP23008 port expander -- the idea is to use easily accessible, off-the-shelf components throughout the system so we can scale up quickly.
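One nice property of the MCP23008 is that a single I2C read of its GPIO register returns all eight pin states at once. A rough sketch of the decoding side, assuming active-low buttons with the chip's internal pull-ups enabled (the pin assignments here are made up for illustration, not the actual wiring):

```cpp
#include <cstdint>

// With the MCP23008's internal pull-ups enabled and buttons wired to
// ground, a pressed button reads as a 0 bit in the GPIO register byte.
// These pin assignments are hypothetical.
enum Button { UP = 0, DOWN = 1, LEFT = 2, RIGHT = 3, A = 4, B = 5 };

// Decode one button's state from a GPIO register read.
bool isPressed(uint8_t gpioByte, Button b) {
    return (gpioByte & (1u << b)) == 0;  // active-low
}
```

Reading the whole port in one transaction keeps the I2C bus traffic low even when polling every frame.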

I finally had a chance to design some boards on the Othermill, and felt really empowered by the process. This is a breakout board I designed to interface all of the sensors throughout the Scope with the Adafruit Feather form factor we adopted for the microcontroller. Going with the Feather system has made it super straightforward to add things like motor controllers or small status displays. Eventually we'll design our own custom boards top to bottom, but working with preexisting components lets us move much more quickly.

We wanted a microcontroller with great performance and a lot of interrupt pins to get the best resolution we can out of the optical rotary encoders, and went with the much-loved PJRC Teensy 3.2 in a Feather adapter. So far we've been really happy with both the latency and the rotational resolution we're achieving with the whole system.
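For readers curious why interrupt pins matter here: quadrature encoders are typically decoded with a small transition table, stepping the count on every valid state change of the A/B pins. This is the standard technique (it's what libraries like PJRC's Encoder implement), sketched below as pure logic so it can run anywhere; on the Teensy, `update()` would be called from pin-change interrupt handlers:

```cpp
#include <cstdint>

// Quadrature decoding table: index = (prevState << 2) | currState, where
// each state is the 2-bit (A,B) pin pair. Valid Gray-code transitions step
// the count by +1 or -1; invalid jumps (a skipped state) contribute 0.
static const int8_t QDEC_TABLE[16] = {
     0, +1, -1,  0,
    -1,  0,  0, +1,
    +1,  0,  0, -1,
     0, -1, +1,  0
};

struct Encoder {
    uint8_t prev = 0;  // last 2-bit (A,B) state
    long count = 0;    // accumulated position in quadrature steps

    // On a Teensy this would be called from interrupt handlers attached
    // to both encoder pins, so no edge is ever missed.
    void update(bool a, bool b) {
        uint8_t curr = static_cast<uint8_t>((a << 1) | b);
        count += QDEC_TABLE[(prev << 2) | curr];
        prev = curr;
    }
};
```

Counting every edge of both channels gives 4x the resolution of the encoder's rated pulses per revolution, which is where having enough interrupt-capable pins pays off.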

It's a strange feeling to look back at how far the project has come since it was just a bunch of taped up cardboard in my apartment. Even with all the work that's gone in so far, it really feels like I'm just now getting started on doing the work I set out to do. 
