Disclaimer: I am a hobbyist who uses a project like this to learn new things: the USB protocol, OpenCV with stereo vision, object tracking, processing and filtering IMU sensor data, and integrating all of the above into existing large libraries. This project has absolutely NO commercial interest and is only being conducted to learn. If this infringes on your rights and you feel this learning exercise should be stopped, please let me know.

It's amazing how they messed up even the head tracking! The headset was clearly designed to be worn on the head, and the phone's location as well as the center of rotation of the head is known within reason, say ±2 cm. However, during my testing it became obvious that the simplest 3DOF head tracking moves the world around in all six degrees of freedom! The cameras then act as a method to recalibrate the location of the headset, which in turn leads to the world constantly shifting in X, Y, and Z! If this weren't AR, it would be highly nauseating! It even goes so far that looking left and right rotates the world around the yaw axis until the stereo cameras pull the world straight again.
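To illustrate the kind of fix I'd expect here, below is a minimal sketch of a complementary filter that continuously nudges the gyro-integrated yaw toward the camera-derived yaw, instead of letting the cameras snap the world back after the fact. The function name and the correction gain are my own assumptions, not anything taken from the Mirage firmware.

```python
import math

# Hypothetical sketch: complementary filter fusing gyro yaw (fast, drifts)
# with camera-derived yaw (slow, absolute). Names and gain are my own
# assumptions, not taken from the Mirage firmware.

def fuse_yaw(yaw_est, gyro_rate, dt, camera_yaw=None, gain=0.02):
    """Return the new yaw estimate in radians.

    yaw_est    -- current fused yaw estimate
    gyro_rate  -- angular rate from the IMU (rad/s)
    dt         -- time step (s)
    camera_yaw -- absolute yaw from marker tracking, or None if no frame
    gain       -- how strongly the camera pulls the estimate (0..1)
    """
    # Integrate the gyro: responsive, but accumulates drift over time.
    yaw_est += gyro_rate * dt
    if camera_yaw is not None:
        # Nudge toward the camera's absolute yaw instead of snapping, so
        # the world gets pulled straight continuously and invisibly.
        error = math.atan2(math.sin(camera_yaw - yaw_est),
                           math.cos(camera_yaw - yaw_est))
        yaw_est += gain * error
    return yaw_est
```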

The other 3DOF, the translation within the world that lets you walk around freely in the space (always keeping the purple bobble on the floor in sight, which had better not reflect in any mirrors or windows), are tracked by the stereo cameras alone. Unfortunately, the latency of the USB connection plus the heavy filtering of the motion makes walking through the world an extremely unsatisfying experience:
The world never moves as expected. It starts moving too late and it stops moving too late. The motion is smoothed out and never really lines up with how your head moved in the world.
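To illustrate the feel of this, here is a toy sketch (my own construction, not Lenovo's actual pipeline) of heavy exponential smoothing applied to a head position: the filtered output starts moving after the real motion begins and keeps drifting after the head has already stopped.

```python
# Toy illustration (my own, not Lenovo's pipeline): an exponential moving
# average over head position. The heavier the smoothing (smaller alpha),
# the later the output starts moving and the later it stops.

def smooth_positions(positions, alpha=0.1):
    """Apply an exponential moving average to a list of 1D positions."""
    smoothed = [positions[0]]
    for p in positions[1:]:
        smoothed.append(alpha * p + (1.0 - alpha) * smoothed[-1])
    return smoothed

# Head moves 10 cm and stops; the filtered position is still catching up
# long after the real head has come to rest.
head = [0.0] * 5 + [10.0] * 15
print(smooth_positions(head))
```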

On top of all that, Lenovo has never released an SDK that would allow anyone to develop software for it, despite literally hundreds of people asking for one on Reddit and on the Lenovo Support Forums.

I, for one, am extremely interested in AR. It has some interesting benefits over VR: you can see what's around you, you can achieve higher angular resolution, and you can project things into your real world without ever leaving it! The potential is amazing! Plus, the headset's image quality is actually quite impressive! It's pixel-sharp from edge to edge, the field of view is quite alright for this price point, and with an iPhone on full blast the brightness is good enough to play games or display photos in reasonably bright environments. Darker is obviously better.

I am not an expert USB reverse engineerererer, but maybe someone out there is interested in chiming in and helping. I am looking for a 4K RGB-stripe display with lots of nits in a small form factor, plus an IMU, to make this all work.

However, the hard part is gaining access to the output of the stereo image processing, either to read the location data of the markers or to grab the raw camera data and extract my own feature set.
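If someone wants to join in on the USB side, a first sketch of how I'd start poking at the device with PyUSB is below. The vendor and product IDs are placeholders I haven't confirmed for the Mirage, so treat this purely as a starting point for dumping descriptors.

```python
import usb.core
import usb.util

# Sketch of a first reverse-engineering step with PyUSB: dump the
# configurations, interfaces, and endpoints of the headset. The vendor
# and product IDs below are placeholders, not the Mirage's confirmed IDs.
VENDOR_ID = 0x0000   # replace with the ID from `lsusb`
PRODUCT_ID = 0x0000  # replace with the ID from `lsusb`

dev = usb.core.find(idVendor=VENDOR_ID, idProduct=PRODUCT_ID)
if dev is None:
    raise ValueError("Headset not found - check the IDs with lsusb")

for cfg in dev:
    print(f"Configuration {cfg.bConfigurationValue}")
    for intf in cfg:
        print(f"  Interface {intf.bInterfaceNumber}, "
              f"class 0x{intf.bInterfaceClass:02x}")
        for ep in intf:
            direction = usb.util.endpoint_direction(ep.bEndpointAddress)
            kind = "IN" if direction == usb.util.ENDPOINT_IN else "OUT"
            print(f"    Endpoint 0x{ep.bEndpointAddress:02x} ({kind})")
```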

Goals:

  • Learn! I am not a vision or USB expert at all! That's why I'm doing this. To learn!
  • Create an open source AR headset based on the currently insanely cheap original Lenovo Mirage AR headset, with 6DOF head tracking using tracking markers on surfaces, stand-up pins, or walls (no real inside-out tracking, unless you already have one that you can offer), as well as tool tracking to be able to create one or more input devices tracked directly from the headset (see the marker-tracking sketch after this list).
  • Unreal Engine Support
  • Unity Support
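For the marker tracking itself, here is a rough sketch of how I'd recover a marker's 6DOF pose with OpenCV's ArUco module (API as of OpenCV 4.7+). The dictionary, marker size, and camera intrinsics are all placeholder assumptions until the headset's cameras are actually calibrated.

```python
import cv2
import numpy as np

# Sketch of marker-based 6DOF pose recovery with OpenCV's ArUco module.
# The camera matrix and distortion coefficients are placeholders; real
# values would come from calibrating the headset's stereo cameras.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
MARKER_SIZE = 0.05  # marker edge length in meters (my assumption)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

# 3D corners of a square marker centered at the origin, in the corner
# order expected by SOLVEPNP_IPPE_SQUARE.
half = MARKER_SIZE / 2.0
object_points = np.array([[-half,  half, 0.0],
                          [ half,  half, 0.0],
                          [ half, -half, 0.0],
                          [-half, -half, 0.0]], dtype=np.float32)

frame = cv2.imread("camera_frame.png")  # stand-in for a live camera frame
corners, ids, _rejected = detector.detectMarkers(frame)
if ids is not None:
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        # Solve the perspective-n-point problem for this marker to get
        # its rotation and translation relative to the camera.
        ok, rvec, tvec = cv2.solvePnP(object_points, marker_corners[0],
                                      camera_matrix, dist_coeffs,
                                      flags=cv2.SOLVEPNP_IPPE_SQUARE)
        if ok:
            print(f"Marker {marker_id}: position {tvec.ravel()} m")
```

The same pose, inverted, gives the headset's position relative to the marker, which is what the 6DOF head tracking goal above needs.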