05/24/2016 at 21:51
To understand the basics of how a Perceptoscope works, it helps to recognize that augmented and virtual reality devices have two primary functional components beyond the computer that powers them: optics and tracking.
In many ways, the optics are the most obvious part people notice when they put on a headset. VR researchers found that magnifying a small screen could provide a wide field of view to the end user. The modern VR headset was born by rendering two side-by-side images on a single screen, each magnified by a lens in front of one eye.
However, tracking is really the unsung hero of what makes modern VR and AR possible, and the ways in which a particular headset does its tracking are critical to its effectiveness.
Early modern HMDs like the Rift DK1 relied solely on accelerometers and gyros to track the movement of the headset through space. This provided a decent understanding of a headset's rotation around a fixed point, but could not really track a person as they leaned forward or backward. The current holdouts still using this approach are mobile-phone-based devices like Google Cardboard and Gear VR.
The next generation of devices combined those standard motion sensors with computer vision techniques to track lateral movement of the headset within a volume of space. The Rift DK2 uses infrared LEDs on the headset and an IR camera attached to your computer, similar in concept to a WiiMote in reverse. Headsets like the HTC Vive use an array of photosensors on the HMD itself, which track how synchronized lasers from emitters called "lighthouses", mounted throughout a space, sweep across them.
Perceptoscopes take an entirely different approach to tracking, based on a human sense known as "proprioception". Essentially, proprioception is the inner image we all have of our body's position and orientation. It's the way we can stumble through a bedroom in the dark, or touch a finger to our nose with our eyes closed.
Perceptoscopes are not wearables extending the human body, but physical objects present in the space. This fixed and embodied form factor, with a limited range of motion along two axes (pitch and yaw), allows us to use rotary encoders geared to each axis to record the precise angle a scope is pointing.
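As a rough sketch of the idea (this is illustrative JavaScript, not the actual Perceptoscope firmware, and the counts-per-revolution and gear ratio are invented values), the conversion from encoder counts to an absolute angle is just a scaling:

```javascript
// Hypothetical values -- the real encoder resolution and gearing differ.
const COUNTS_PER_REV = 2048; // quadrature counts per encoder revolution
const GEAR_RATIO = 3;        // encoder revolutions per full axis revolution

// Because the scope's range of motion is mechanically limited, a raw count
// maps directly to an absolute angle -- no drift correction or prediction.
function countsToDegrees(rawCount) {
  return (rawCount / (COUNTS_PER_REV * GEAR_RATIO)) * 360;
}

// Orient a virtual camera from the two axis readings.
function scopeOrientation(pitchCount, yawCount) {
  return {
    pitch: countsToDegrees(pitchCount), // rotation about the horizontal axis
    yaw: countsToDegrees(yawCount),     // rotation about the vertical axis
  };
}

console.log(scopeOrientation(512, 1024)); // { pitch: 30, yaw: 60 }
```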
So while traditional HMD tracking techniques can be thought of as an analog to the inner ear and eyes working in combination, Perceptoscope's sense of orientation is more like how dancers understand their bodies' shape and position in space.
We've since gone on to incorporate other sensor fusion and computer vision techniques in addition to this proprioceptive robotic sense, but this core approach significantly reduces computational overhead while simultaneously providing an absolute (rather than predictive) understanding of a Perceptoscope's view.
As for what makes our optics special, that's for another post entirely.
05/30/2016 at 03:18
After defining the terms of what a Perceptoscope should be, and the broad strokes for how it would work, I went into a discovery phase of optical experimentation. I bought and took apart basically anything I could get my hands on with optics, and stayed up late into the night examining how combinations of lenses could modify a view of a flat image or the world itself.
Some of my earliest prototypes were sacrificed and torn apart in the chaos of experimentation. They were large and cumbersome, and I strained my eyes trying to make the stereo image converge in the space in front of me.
I eventually figured out the right combination of lenses to bring into focus both the outside world as well as the near eye screen, and waited patiently for the military surplus prisms I ordered to arrive.
Once I had everything in one place, I built a cardboard prototype of my entire optical system out of a Dremel box, and started to work with a script in Processing designed to mimic a stereoscopic display.
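The original test script was written in Processing; a rough JavaScript/Three.js equivalent of the same idea (the eye-separation value and the simple camera-offset stereo here are my assumptions, not the original code) renders the same scene twice, side by side, with a small horizontal offset between the two virtual eyes:

```javascript
import * as THREE from 'three';

const EYE_SEPARATION = 0.064; // ~64 mm interpupillary distance (assumed)

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// A single test object to converge on.
const scene = new THREE.Scene();
scene.add(new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
));

const camera = new THREE.PerspectiveCamera(60, 1, 0.1, 100);
camera.position.z = 3;

function renderStereo() {
  const w = window.innerWidth / 2;
  const h = window.innerHeight;
  renderer.setScissorTest(true);

  // Left eye: shift the camera half the eye separation to the left
  // and draw into the left half of the screen.
  camera.aspect = w / h;
  camera.updateProjectionMatrix();
  camera.position.x = -EYE_SEPARATION / 2;
  renderer.setViewport(0, 0, w, h);
  renderer.setScissor(0, 0, w, h);
  renderer.render(scene, camera);

  // Right eye: mirror the offset into the right half.
  camera.position.x = EYE_SEPARATION / 2;
  renderer.setViewport(w, 0, w, h);
  renderer.setScissor(w, 0, w, h);
  renderer.render(scene, camera);

  requestAnimationFrame(renderStereo);
}
renderStereo();
```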
I started building a larger box out of a shoebox to hold this cardboard optics prototype, so I could start playing with throwing objects further into the space.
Now that I was satisfied with the layout, I designed a housing for all the optics in Rhino, and sent it off to be 3D printed.
There's still a lot of experimenting and math to be done to really perfect the system, but it also has some unique optical qualities that I never could have anticipated. For one, the way the outside world is brought into view actually gives users a wider angle of view than their typical sight. This effectively exaggerates the amount of depth between objects, and makes the outside world feel more three-dimensional. Even without the augmented reality compositing, Perceptoscope's optics felt distinctly different from the everyday.
As we push the optics of the system further, it'll be fun to discover what other qualities emerge, and how much more impressive the system can be when we experiment with the ability to zoom in and out, or dynamically adjust to outside lighting conditions.
05/30/2016 at 05:18
With the optics produced and the rendering concept working, it was now time to make the system self-contained. Initially, I used USB and HDMI extension cables to tether the scope to my computer, but that wouldn't work in a public deployment. Figuring out what computer should go inside the system was key.
Because I was anticipating long deployments in areas with little to no power access, I initially worked towards using an ARM-based system. My brother Adam had done some work previously in thin clients for digital kiosks, and gave me an ODROID X2 to play with.
Going with the ODROID meant we'd focus on Android as our OS, and though I had done some early prototyping pulling sensor data from an Arduino into Unity over USB, we needed to go a different direction for ARM, particularly if we wanted to use the shaders required for stereo VR rendering.
We started playing around with Jmonkey, an open source Java game engine, thinking that a Java approach would translate well to Android. With a little nudging we got there, integrating an Arduino with sensors providing positional information to the virtual camera of the Jmonkey VR scene. I had to write a decent amount of the particulars of the renderer myself, as VR distortion shaders were just starting to make their way into the engine, but we couldn't have gotten there without the larger effort of the open-source community.
Along the way I started building a more rugged shell out of a cheap little camera case, and used Actobotics to create the geared axles and mounts for the encoders.
We were now ready to get the scope out of my apartment and in front of people.
05/30/2016 at 06:37
I started to show the Perceptoscope to people around the interactive arts and hacker scene in LA. It was a nice safe way to demo, get feedback, and begin thinking about how to improve. Along the way, I got introduced to the gang from Two Bit Circus, and was given an opportunity to have the Perceptoscope premiere publicly at their first ever STEAM Carnival.
It was an exciting but terrifying moment. This was more of a proof of concept prototype than a rugged unit designed for persistent public deployment. I was going to need some help to get it ready for the Carnival only about a month away.
I had become a more active member of my local hackerspace, CRASH Space, and with the help of Steve, a fellow member and experienced inventor, I worked to get the scope protected and ready.
We added a lock to the bottom of the case so kids couldn't open it, and laser cut a set of acrylic shrouds to cover the gears and encoders. I found an old UPS battery to supply the power, and scrambled to ruggedize the wiring and write the software for the experience.
It was a simple experience. Using a physics engine, a random array of shapes would populate the space, and buttons on the scope let the user shoot at and modify the shapes they were pointing at.
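The original ran on the Jmonkey stack described earlier, but the interaction is simple enough to sketch in JavaScript with Three.js (the shape count, the shrink-on-hit behavior, and all the values here are illustrative, not the code that shipped to the Carnival):

```javascript
import * as THREE from 'three';

const scene = new THREE.Scene();
const shapes = [];

// Populate the space with a random scattering of boxes, standing in for
// the physics-engine shapes of the original experience.
for (let i = 0; i < 50; i++) {
  const mesh = new THREE.Mesh(
    new THREE.BoxGeometry(0.5, 0.5, 0.5),
    new THREE.MeshNormalMaterial()
  );
  mesh.position.set(
    (Math.random() - 0.5) * 20,
    (Math.random() - 0.5) * 20,
    (Math.random() - 0.5) * 20
  );
  scene.add(mesh);
  shapes.push(mesh);
}

const camera = new THREE.PerspectiveCamera(60, 1, 0.1, 100);
const raycaster = new THREE.Raycaster();

// Called on a button press: cast a ray straight out of the camera, i.e.
// wherever the scope is physically pointing, and modify the first hit.
function shoot() {
  raycaster.setFromCamera(new THREE.Vector2(0, 0), camera);
  const hits = raycaster.intersectObjects(shapes);
  if (hits.length > 0) {
    const target = hits[0].object;
    target.scale.multiplyScalar(0.5); // "damage" the shape by shrinking it
    if (target.scale.x < 0.1) scene.remove(target);
  }
}
```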
Over the course of four days at the Carnival, hundreds of kids and families played with the Perceptoscope, and I gained a huge amount of insight. However, it barely survived the experience, and I was constantly fighting short circuits, software glitches, and aggressive children pulling on delicate assemblies.
Exhausted and feeling a bit defeated, I took some time to regroup while I redesigned a more bulletproof Perceptoscope. The next step was getting together the courage and funding to build it.
07/12/2016 at 18:52
There's still a lot of story left to be told about how we made it to the Mark II prototype, but in the meantime I wanted to mention an exciting update to the project that has come together over the past month or so since our time at Maker Faire.
Perceptoscope has been invited to be one of the inaugural residents at the SupplyFrame DesignLab! SupplyFrame is the parent company of Hackaday, and for the next three months we'll be working out of this amazing space to focus hard on building a bunch of new Perceptoscope prototypes. Winners of the Hackaday prize will also have an opportunity for a residency, and we can only imagine all the badass projects that will come out of here in the years to come.
We've only been in the Lab a little over a week, but have already jumped in head first to printing another optical manifold and prototyping some bearing blocks for the next revision. We made a trip to Industrial Metal Supply to stock up on aluminum for the Tormach CNC, and have new yokes being fabricated by a pipe bender nearby.
Each week for the next three months I'll be posting updates of the progress we're making. There are still a number of technical and practical challenges ahead with something as complex as Perceptoscope. As I document our journey, I hope it can shine a light not only on the design and engineering issues hardware projects face, but also things like sourcing a reliable supply chain, engaging in public playtesting, and securing partnerships to help the project grow.
07/16/2016 at 07:42
The heart of a Perceptoscope is made of glass.
More specifically, an assembly of prisms and lenses. This week I took the opportunity to experiment with some new approaches to building the crystal that sits at its center.
Beamsplitter cubes were always an especially difficult optical component to source at the specifications I needed. Funny enough, right-angle prisms are fairly easy to get, and I can make my own cubes by gluing them together.
My original assembly was never really designed with manufacture in mind. I needed to think of something easily reproducible for the supply chains I could establish.
I had an idea for a new type of shell that would close around the optics to hold their alignment, so I mocked up a prototype monocle with some components I had lying around.
This should work fairly well as an approach for the final assembly. It also lets me quickly optimize the optical properties for a given set of components.
07/22/2016 at 22:10
This week at the DesignLab was mostly about design refinement and training for the heavy duty tools.
I picked up the yokes for the new Scopes from the pipe bender. Bending should be a much faster and more cost-effective way to produce new units. They seem to fit the bearing blocks I printed a while back fairly well too.
The final bearing blocks will be milled out of aluminum, so I did some training to get up to snuff on the Tormach CNC we have in house.
It's going to take a little finesse to get my bearing block design ready to be milled, but I'm excited to see what this CNC can do.
With the supply chain side of my optics fairly figured out, I've been iterating on the housing design a bit. I'm getting fairly close to having something production ready for these next few units. The question will be whether to stick to printing them or start experimenting with resin casting, injection molding, or milling them out of aluminum on the CNC.
07/25/2016 at 22:07
Though there was excitement around the first Perceptoscope, it was never designed for everyday deployment. The concept was proven, the seeds of the platform were built, but there was still a lot of work to be done to make it something that locations would be willing and ready to use for decades.
In May of 2015, Perceptoscope was honored to receive a grant from the Knight Foundation Prototype Fund. This not only gave the project some working capital to get the components and manufacturing services required, but also opened us up to a network of like-minded individuals looking to use technology to engage people with the stories around them.
The grant process began with a workshop in Miami on Human Centered Design led by the LUMA Institute. While there, I worked through a variety of different design thinking exercises with the other grantees. It was a great experience, and had me chomping at the bit to get home and dive in.
Once I was back in LA, I worked through a few exercises with Adam to formulate solutions to some of the bigger challenges we were facing. We wanted to design a platform that would be easy to deploy and manage, and to stay focused on building as much of the stack as possible with open-source tools.
We settled on transitioning away from our previous ARM/Android/Jmonkey stack, and towards something that would be a little more straightforward. We went with an Intel NUC mini-computer running Linux, and refactored our stack to be built on web technologies that we could quickly spin up with different hardware architectures for ease of experimentation. Conceivably this stack could be completely architecture independent, and fly on either x86 or ARM.
Node.js would handle all the backend scripting for our sensors, as well as host the internal web server that serves the content. Three.js and (what would become) WebVR would handle the 3D rendering. This approach is not only super low-overhead compared to a game engine, but opens up the possibility of a Perceptoscope dynamically sharing content with other mobile devices nearby. At its core, Perceptoscope is about activating public spaces, and this new possibility could significantly extend its reach.
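As a minimal sketch of that pattern (not the production code; the "pitch,yaw" serial format, the port path, and the express/ws/serialport modules are my assumptions), the backend can be surprisingly small:

```javascript
const express = require('express');
const { WebSocketServer, WebSocket } = require('ws');
const { SerialPort } = require('serialport');
const { ReadlineParser } = require('@serialport/parser-readline');

// Serve the Three.js/WebVR front end from the scope's internal web server.
const app = express();
app.use(express.static('public'));
const server = app.listen(8080);

// Broadcast orientation to every connected client: the scope's own display,
// or nearby mobile devices sharing in the experience.
const wss = new WebSocketServer({ server });

// The Arduino streams "pitch,yaw" angle pairs, one per line (assumed format).
const port = new SerialPort({ path: '/dev/ttyACM0', baudRate: 115200 });
const parser = port.pipe(new ReadlineParser({ delimiter: '\n' }));

parser.on('data', (line) => {
  const [pitch, yaw] = line.trim().split(',').map(Number);
  if (Number.isNaN(pitch) || Number.isNaN(yaw)) return; // skip garbled lines
  const msg = JSON.stringify({ pitch, yaw });
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(msg);
  }
});
```

On the client side, the browser just applies each incoming pitch/yaw pair to the rotation of its Three.js camera.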
Industrial design was also a big focus of this new rev, and some of the most significant improvements were made to what might be considered the most boring components. Steve and I brainstormed at CRASH Space about how best to tackle them. To ensure the mechanics wouldn't break down under abuse, we had a custom yoke welded, and bearing blocks and encoder mounts were milled by a local machinist. The mechanical assembly was starting to feel bullet (and kid) proof.
I knew the internals of the Scope would always be evolving, so its skeletal structure was built out of 8020 for ease in mounting whatever needed to go inside. We laser cut shelves to hold components like the screen and optics assembly, and mounted them with standard 8020 gear.
The final step in the manufacturing process was shelling the Scope. Laser cutting acrylic seemed like a fast way to prototype out a single shell, so Steve and I worked out of Craig's Plastics down the road from Crash. Lucky for us, they have an extra large laser cutter there. We started by prototyping the shell in cardboard before cutting the final pieces. Craig built jigs to miter the edges of all the pieces and glued them together seamlessly.
I had designed the shell a bit tight, but it was nothing a good angle grinder couldn't fix.
All that was left now was systems integration.
Adam designed a daughterboard to hook up all our components to an Arduino Pro Micro, and I built a power regulator sled to break out relevant voltages to the internal systems. We twisted together a wiring harness, and routed it through the body of the scope.
With the systems all in place, it was time to boot it up. Success!
I built a pedestal to hold the Scope, and quickly got it ready for an event Perceptoscope was invited to called "The Future of Cities" at the Los Angeles County Museum of Art. It was down to the wire quashing all the bugs needed to get it ready to deploy, but thankfully we made it.
The Perceptoscope was given a prime spot between some folks from the MIT Media Lab and Jesús Rafael Soto’s kinetic sculpture "Penetrable". On the experience side, I had designed a simple story around El Aliso, the lost council tree of the original native people of the LA basin. Where the tree once stood is now an onramp to the 101, and via the Perceptoscope people could see a 3D representation of the tree in the space in front of them, and then jump to a 360 panorama of what its former site looks like today.
It was a great experience sharing the story of El Aliso and the latest Perceptoscope at a place that means as much to me as LACMA does. My hope now is to bring El Aliso back to life more permanently, as a monument on the actual site where it stood.
After an exhausting sprint, I packed up the Scope and headed to Miami to present my findings to Knight for the Prototype Fund's Demo Day. There was still a lot of refinement left to do to get the project ready to scale, but it felt good to share what we had been able to accomplish in such a short sprint.
I started planning for the next few Scopes, and working to set up some more deployments around the city. If I really wanted this to be effective, we'd need a minimum of three scopes to be able to take over a location. Either way, it was time to focus on getting this out of my apartment and into the world.
07/29/2016 at 23:04
This week started off with some excitement -- I'm psyched to announce Perceptoscope is a finalist in the SXSW Eco Place by Design competition! I'll be headed to Austin in October to present the latest developments in the project.
In terms of manufacturing this week, the big focus was on prototyping out the stereoscopic optical assembly and starting to mill the bearing blocks on the Tormach.
The optics came out looking mighty nice and the FOV is even better than the last unit.
Milling the bearing blocks was less successful. It seemed like I had gotten my tool paths figured out...
But speed and feed definitely weren't quite right... We ended up breaking the tool holder in a pretty magnificent way.
It's going to take a little more milling and research to get things quite right, but we should be able to get it humming next week.
08/06/2016 at 02:09
This week was mostly focused on iterating and experimenting with fabrication techniques.
It started off with a print hot off the Objet of the latest optics housing.
The resolution on this thing is excellent, though it's definitely not the most cost-effective approach for each unit. We'll likely use this proof print as a positive to cast some acrylic molds.
On the shell front, rather than vacuum forming or casting, I decided to play a bit with the idea of folding together a shell out of laser cut sheet metal. I want to go for something low poly, a sort of modified dodecahedron.
I used 123D Make to unfold the mesh into something that could be laser cut.
And sent it to the Epilog...
Seems like an interesting approach, though it might drive a fabricator crazy. Vacuum forming will probably be fastest for something with this many facets.
We also finally got our heads wrapped around CNC milling the bearing blocks. There's a definite art to it that's going to take some time to master.
To start off we faced down the stock to the dimensions of the final piece (and to shred away the evidence of last week's mistake). After that we used some adaptive clearing to mill out the pocket.
We'll still need to clean it out with a ball end mill, and then bore out the axle and mounting holes, but it's certainly starting to look sturdy enough to survive.