
HTC Vive lighthouse custom tracking

Custom lighthouse tracking using disassembled sensors, an FPGA for decoding the signals and an Arduino for wireless transmission.

HTC Vive lighthouse tracking is super nice in terms of accuracy and cost. While a typical mocap system costs a lot of money, a pair of lighthouses costs only approx. 250 euro. Then of course you need sensors, but those are rather cheap as well (approx. 5 euro). In this log we are going to describe our system, which replicates the functionality of the HTC system, i.e. 3D tracking of custom sensors and pose estimation of objects.

We are a student team from Munich, Germany. Our goal is to make Roboy balance and walk. For this we need accurate tracking. We decided to replicate HTC lighthouse tracking for our purposes and in November 2016 a fascinating journey began.
Check out this excellent review of the HTC lighthouse tracking system.

For the first prototypes we were using OpenVR/SteamVR. SteamVR comes as prebuilt libraries and needs special LD_LIBRARY_PATH settings to run successfully. This conflicts with our ROS Kinetic builds and is altogether very messy. Then, after one of the recent updates, SteamVR stopped working on our computers. This was actually the reason we started digging into how HTC might have solved the tracking.

At first we tried decoding the signals from the sensors using an Intel Edison and an MKR1000. With the Edison this turned out to be impossible, and with the MKR it was limited to a small number of sensors. On the Edison the hardware interrupts were not handled fast enough, due to its threaded interrupt system. We also tried using its MCU, which wasn't fit for the job either.

The MKR was simply overwhelmed by all the interrupts.

We disassembled one of the HTC Vive controllers to get our hands on those sensors. We noticed the HTC controllers use a Lattice iCE FPGA. So we thought: if they use one, there must be a reason.

We then soldered VCC, GND and signal copper cables (0.1 mm), using enough flux, and covered the sensor with a bit of glue to protect it from accidental damage.

In the previous prototype we routed all the signal cables coming from the sensors in parallel. This turned out to be a bad idea: due to inductive coupling, the signals pollute each other. In the Vive controller they deal with this by isolating each signal line with VCC and GND, so that's what we are doing as well.

In the following picture you can see the complete setup:

  1. The custom object, with 4 sensors
  2. The DE0-Nano FPGA
  3. The MKR1000

[Note: This is the old setup. New updated setup is described in section 7.]

Notice that there are only 4 sensor signal cables (grey, blue, yellow, red). The other cables are VCC (purple, orange) and GND (green, brown).

The connection to the MKR is via SPI, with the MKR acting as the master. An additional pin notifies the MKR when new data is available; this triggers the SPI transfer.

Our Vive tracking consists of a couple of modules:

  1. Decoding the sensor signals and calculating the sweep durations (this is done on the FPGA)
  2. Transmitting the sensor values via SPI to the MKR1000
  3. Transmitting the sensor values wirelessly via UDP to the host
  4. Triangulation of the lighthouse rays (see the sketch after this list)
  5. Distance estimation w.r.t. a calibrated object
  6. Relative pose correction using a calibrated object
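
Steps 1 to 3 are detailed in the sections below. For step 4, each lighthouse defines a ray from its origin towards the sensor, reconstructed from the two sweep angles; the sensor position is then the point closest to both rays. Here is a minimal sketch, assuming Eigen for the linear algebra (the function names and the angle constants are our own illustration, not our production code; the 4000 µs center offset is a commonly used approximation):

```cpp
#include <Eigen/Dense>
#include <cmath>

// Convert a sweep duration (1us ticks since the sync flash) into an angle.
// The rotor turns 180 degrees in ~8333us (60Hz rotation); by a commonly
// used approximation the beam points straight ahead ~4000us after sync.
double sweepAngle(double duration_us) {
    return (duration_us - 4000.0) * M_PI / 8333.0;
}

// Closest-point ("midpoint") triangulation of two lighthouse rays.
// Lighthouse i sits at p_i and saw the sensor along direction d_i.
Eigen::Vector3d triangulateRays(const Eigen::Vector3d &p0, Eigen::Vector3d d0,
                                const Eigen::Vector3d &p1, Eigen::Vector3d d1) {
    d0.normalize();
    d1.normalize();
    // minimize |(p0 + s*d0) - (p1 + t*d1)|^2 over the ray parameters s, t
    Eigen::Vector3d w = p0 - p1;
    double b = d0.dot(d1), d = d0.dot(w), e = d1.dot(w);
    double denom = 1.0 - b * b;              // ~0 for (near-)parallel rays
    if (std::abs(denom) < 1e-9) return 0.5 * (p0 + p1);
    double s = (b * e - d) / denom;
    double t = (e - b * d) / denom;
    // return the midpoint between the closest points on both rays
    return 0.5 * ((p0 + s * d0) + (p1 + t * d1));
}
```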

1. Decoding sensor signals

[Note: This is the old decoder. New updated decoder is described in section 7.]

On the DE0-Nano we are using a PLL to generate a 1 MHz (1 µs) clock.

Then we feed the sensor signals into one of these lighthouse modules:
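
Functionally, the lighthouse module classifies each pulse on the sensor envelope by its width (long sync flashes vs. short sweep hits) and measures the time from sync to sweep. The following is a plain-C++ model of that timing logic (the actual implementation is Verilog; the pulse-width constants are the commonly quoted lighthouse values and are assumptions here, not copied from our code):

```cpp
#include <cstdint>

// Model of the timing logic in the Verilog lighthouse module.
// Timestamps and widths are in 1MHz (1us) ticks from the PLL clock.
struct LighthouseDecoder {
    uint32_t sync_start = 0; // timestamp of the last valid sync flash
    bool rotor = false;      // axis bit decoded from the sync pulse width
    bool data  = false;      // OOTX data bit from the sync pulse width

    // Called once per pulse on the sensor envelope signal. Returns true
    // and sets sweep_duration when a valid laser sweep was seen.
    bool onPulse(uint32_t t_rise, uint32_t width_us, uint32_t &sweep_duration) {
        if (width_us > 50) {
            // long pulse: sync flash. Widths start at ~62.5us and grow in
            // ~10.4us steps, encoding 3 bits (skip, data, axis).
            uint32_t j = (width_us - 57) / 10;
            bool skip = (j >> 2) & 1;
            if (!skip) {          // this lighthouse sweeps in this cycle
                data  = (j >> 1) & 1;
                rotor = j & 1;
                sync_start = t_rise;
            }
            return false;
        }
        // short pulse (~10us): the laser sweep hit the sensor
        sweep_duration = t_rise - sync_start;
        return true;
    }
};
```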

The SPI module looks like this:

2. Transmitting the sensor signals via SPI

[Note: This is the old data format. New data format is described in section 7.]

The MKR acts as the SPI master. Whenever there is new data available (i.e. when the FPGA has decoded a valid sweep), it notifies the MKR via an extra pin. The MKR then starts downloading a 32-bit field, which encodes the data in the following way:

  • bits 31-13: sweep duration (in microseconds)
  • bit 12: valid sweep
  • bit 11: data
  • bit 10: rotor
  • bit 9: lighthouse
  • bits 8-0: sensor id
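
Unpacking that field is just a few shifts and masks; a minimal sketch (the struct and function names are illustrative):

```cpp
#include <cstdint>

// Decoded form of the 32-bit SPI word described above.
struct SensorValue {
    uint32_t sweep_duration; // bits 31-13, in microseconds
    bool valid;              // bit 12: valid sweep
    bool data;               // bit 11: data bit
    bool rotor;              // bit 10: which rotor swept
    bool lighthouse;         // bit 9: which lighthouse
    uint16_t sensor_id;      // bits 8-0
};

SensorValue unpack(uint32_t word) {
    SensorValue v;
    v.sweep_duration = word >> 13;
    v.valid      = (word >> 12) & 1;
    v.data       = (word >> 11) & 1;
    v.rotor      = (word >> 10) & 1;
    v.lighthouse = (word >> 9)  & 1;
    v.sensor_id  = word & 0x1FF; // lower 9 bits
    return v;
}
```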

3. Transmitting the 32-bit field via UDP

The host listens for UDP broadcast messages. We are using Google protobuf for the custom messages. When the host receives a trackedObjectConfig message, it opens sockets for the sensor and logging messages and sends the respective ports back to the MKR in a commandConfig message. The MKR waits for this message and, once it has been received, starts sending the sensor values augmented with a millisecond timestamp.
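
Stripped of the protobuf encoding, the host side of this handshake boils down to something like the following sketch (POSIX sockets; the port numbers and the message payloads are illustrative, not our actual protocol):

```cpp
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    int broadcast = 1;
    setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &broadcast, sizeof(broadcast));

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port = htons(8000);          // broadcast port, illustrative
    bind(sock, (sockaddr *)&addr, sizeof(addr));

    // 1. wait for a trackedObjectConfig broadcast from the tracked object
    char buf[512];
    sockaddr_in sender{};
    socklen_t len = sizeof(sender);
    ssize_t n = recvfrom(sock, buf, sizeof(buf), 0, (sockaddr *)&sender, &len);
    printf("config broadcast received (%zd bytes)\n", n);

    // 2. open the sensor/logging sockets, then tell the MKR their ports
    //    via a commandConfig message (payload illustrative, really protobuf)
    const char reply[] = "commandConfig{sensor_port:8001,logging_port:8002}";
    sendto(sock, reply, sizeof(reply), 0, (sockaddr *)&sender, len);

    // 3. timestamped sensor values now arrive on the sensor port ...
    close(sock);
    return 0;
}
```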

This sort of infrastructure...


Ligthouse_Tracking_MidT.pdf

Midterm presentation slides of the project. Feel free to take a look :)

Adobe Portable Document Format - 14.53 MB - 02/01/2017 at 12:25


  • 1 × Altera DE0-Nano
  • 1 × Arduino MKR1000
  • 8 × HTC Vive lighthouse sensors, disassembled from a controller
  • 1 × ESP8266

  • IMU MPU6050

    Simon Trendel, 03/08/2017 at 23:04

    The next step was to retrieve IMU data from our tracked objects. Even though the DE0-Nano comes with an accelerometer, we are using an external MPU6050. They are quite cheap, around 3 euro, and come with an accelerometer, a gyroscope and the Digital Motion Processor (DMP), which fuses the sensor values into useful quantities, e.g. a quaternion or the gravity vector. The data is retrieved on the ESP8266, which communicates with the MPU6050 via I2C. The MPU6050 has an extra interrupt pin, which signals data availability. This interrupt pin is routed into the FPGA and the connection is gated there. This is necessary because we are using GPIO2 on the ESP, which must be high during boot from internal flash (pulling it low selects a different boot mode). We also made progress on the command socket infrastructure, which allows rudimentary control of the hardware from the GUI (we can now reboot the ESP, for example, or toggle IMU streaming). In the video below you see the quaternion from the MPU6050 DMP estimation streamed into our rviz plugin and visualized with the red cube.
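
    For reference, reading the DMP quaternion with Jeff Rowberg's i2cdevlib looks roughly like this. A minimal sketch: we assume the MPU6050_6Axis_MotionApps20 variant of the library (dmpGetCurrentFIFOPacket exists in newer versions; older ones need manual FIFO handling), and for simplicity it polls instead of using the interrupt pin, which in our setup is gated by the FPGA:

```cpp
#include <Wire.h>
#include "MPU6050_6Axis_MotionApps20.h" // Jeff Rowberg's i2cdevlib

MPU6050 mpu;
uint8_t fifoBuffer[64]; // holds one DMP packet
Quaternion q;

void setup() {
    Wire.begin();            // I2C to the MPU6050
    Serial.begin(115200);
    mpu.initialize();
    mpu.dmpInitialize();     // upload the DMP firmware
    mpu.setDMPEnabled(true); // INT pin now signals "data ready"
}

void loop() {
    // fetch the latest full packet from the DMP FIFO, if there is one
    if (mpu.dmpGetCurrentFIFOPacket(fifoBuffer)) {
        mpu.dmpGetQuaternion(&q, fifoBuffer);
        Serial.printf("quat % .3f % .3f % .3f % .3f\n", q.w, q.x, q.y, q.z);
    }
}
```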

    The orientation looks very stable, so we hope we can skip the bloody EPnP stuff... we will see.

    Below you can see how the MPU6050 is wired up to the esp and the de0-nano.

  • Using the ESP8266 instead of the MKR

    Simon Trendel, 02/28/2017 at 23:49

    The MKR1000 has always been an intermediate solution. Even though it is quite small, there is an even smaller WiFi chip, the ESP8266. It has a programmable microcontroller and just needs some resistors, two buttons and a serial programmer to be programmed.

    NOTE: there seems to be a lot of confusion out there about how to correctly wire the little guy in order to program it. Therefore, the following should help you get started:

    • do not use the 3.3 V VCC of your serial programmer (USB does not seem to provide enough current for the ESP, which can peak at 200 mA). Instead, use a voltage supply with enough current. The ESP seems to be very sensitive to incorrect voltage, which leads to undefined behaviour causing major headaches
    • the slave select pin is GPIO15, which needs to be low on boot
    • use the following wiring scheme:

    When programming the ESP (be it from the Arduino IDE or the command line), hold both buttons pressed, start the download, release reset, then release flash. Your bin should start downloading; if not, try again (and make sure you pay attention to the NOTE above).

    We removed the SPI core we had been using so far, simply because we didn't trust it: we noticed some glitches in our sensor values, and the SPI core was one suspect. Anyways... here is how it looks now:

    The ESP acts as the SPI slave (which makes a lot more sense, because the FPGA should control when data is sent out). One SPI frame for the ESP consists of 32 bytes = 256 bits = 8 sensor values. In each lighthouse cycle, a frame with up to eight sensor values is transmitted to the ESP via SPI. The code for the ESP hasn't changed much compared to the previous MKR code, except for the SPI slave part and that we now send all 8 values out via UDP in one go, instead of each sensor value individually.
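
    In code, the ESP side is essentially the SPISlave library from the ESP8266 Arduino core plus WiFiUDP. A condensed sketch (WiFi credentials, host address and port are placeholders):

```cpp
#include <ESP8266WiFi.h>
#include <WiFiUdp.h>
#include "SPISlave.h"

WiFiUDP udp;
const IPAddress host(192, 168, 0, 42); // placeholder host address
const uint16_t sensorPort = 8001;      // placeholder port

volatile bool frameReady = false;
uint8_t frame[32]; // the hardware SPI slave always moves 32-byte frames

void setup() {
    WiFi.begin("ssid", "password");    // placeholders
    while (WiFi.status() != WL_CONNECTED) delay(100);

    // 32 bytes = 8 sensor values of 32 bits each. The callback runs in
    // interrupt context, so we only copy the frame and set a flag.
    SPISlave.onData([](uint8_t *data, size_t len) {
        memcpy(frame, data, len);
        frameReady = true;
    });
    SPISlave.begin(); // GPIO15 acts as slave select, low on boot
}

void loop() {
    if (frameReady) { // forward all 8 sensor values in one UDP packet
        frameReady = false;
        udp.beginPacket(host, sensorPort);
        udp.write(frame, sizeof(frame));
        udp.endPacket();
    }
}
```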

