
HiveTracker

A tiny, low-cost, and scalable device for sub-millimetric 3D positioning.

This project is a miniaturization of the Vive Tracker by HTC. It allows sub-millimetric 3D positioning and embeds a 9DoF IMU with sensor fusion. It's cheaper than other 3D positioning systems and allows tracking many more objects. The entire project is open source; all the materials can be found below, or here: http://HiveTracker.github.io

Latest News

Thanks to all of you who support and follow us, your feedback is hugely helpful...

___________________________________________________________________________

Challenge

Indoor 3D positioning is a problem that has been tackled for a while, but there is still no perfect solution.

The closest available option is probably camera-based motion capture, but it's very expensive and doesn't scale (above a dozen tracked objects, the system lags).


Solution

After ultrasound and magnetic approaches (see legacy), it looks like lasers are not too bad!

We made our own board that uses photosensors to piggyback on the HTC Vive lighthouses.

This project gave birth to a first academic publication at the HCI conference Augmented Human; the published details are very useful, but don't miss the logs for even more.

Anyway, here is a demo:

Wait, but why?

With backgrounds in HCI (human-computer interaction), neuroscience, robotics, and various other engineering fields, we all found the miniature 3D positioning challenge worth solving.

Some of the main applications we are interested in are listed here:

- fablab of the future: documenting handcrafts accurately, to help teach them later through more than video (using haptic feedback, for example)

- robotic surgery controllers can be greatly improved (size, weight, accuracy...)

- dance performances can already be greatly enhanced with machine learning (example), but not that easily in real time.

- tracking the positions and brain activity of a "hive" of rats is probably the most complicated challenge; here is a simpler example using computer vision:


How does it work?

HTC Vive uses 2 lighthouses that emit laser signals accurate enough to perform sub-millimeter 3D positioning. The system looks like this:

This gif summarizes the principle fairly well, but clicking on it will open an even better video:

The idea is to measure the timings at which we get hit by each laser sweep.

Since we know the rotation speed and receive a broadcast sync pulse, we can get 2 angles from each lighthouse, and estimate our position from the intersection (see details in the logs).

Our approach is to observe these light signals and estimate our position from them.
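To make the timing idea concrete, here is a minimal sketch of the sweep-to-angle conversion (the variable names and the 16 MHz capture clock are our illustrative assumptions; the 60 rotations/s motor speed is detailed in the validation log below):

```cpp
#include <cstdint>

// Sketch: turn a captured sweep timestamp into an angle.
// t_sync  = timer capture at the sync pulse (start of a rotation)
// t_sweep = timer capture when the laser sweep hits the photodiode
float sweepAngleRadians(uint32_t t_sync, uint32_t t_sweep) {
    const float kTicksPerSecond = 16e6f;         // assumed 16 MHz capture clock
    const float kTurnPeriod     = 1.0f / 60.0f;  // one rotation = 16.667 ms
    float dt = (t_sweep - t_sync) / kTicksPerSecond;  // seconds since sync
    return 6.2831853f * (dt / kTurnPeriod);           // fraction of a turn -> radians
}
```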

Trick alert!

Those signals are way too fast to be observed by normal microcontrollers, but we found one (with BLE) that has a very particular feature called PPI, which allows parallel processing a bit like an FPGA.

Check out the "Miniaturization Idea Validation" log page to learn more about it.

In Progress

We're still fighting with a few problems, but as we're open sourcing everything, maybe the community will help?

Please contact us if you are interested in helping; below are some of the current issues that need some love. We have a solution for every problem, but more expert hands can't hurt:

- Kalman filtering: we fuse the integrated accelerometer data with the optically obtained 3D position to improve our results. Our proof of concept is getting close to usable; now we need to port it (see the sketch after this list).

- Calibration: when using the trackers for the first time, finding the bases' positions & orientations is not trivial. We have a procedure, but it could be greatly improved...
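As a rough illustration of the fusion idea mentioned above (a minimal 1D sketch with placeholder names and gain, not our actual implementation), the accelerometer prediction runs fast and each optical fix corrects the drift:

```cpp
// Minimal 1D predict/update sketch of the IMU + optical fusion idea.
// The gain is a placeholder; a real Kalman filter would derive it from
// the process and measurement covariances.
struct Fused1D {
    float x = 0.0f;  // position estimate (m)
    float v = 0.0f;  // velocity estimate (m/s)
    float k = 0.3f;  // correction gain (placeholder)

    // Called at IMU rate (e.g. 100 Hz) with the measured acceleration.
    void predict(float accel, float dt) {
        v += accel * dt;
        x += v * dt;
    }
    // Called whenever an optical fix arrives (slower, but drift-free).
    void update(float xOptical) {
        x += k * (xOptical - x);  // pull the estimate toward the measurement
    }
};
```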

 

Who Are We?

This is a research project made with L❤VE by "Vigilante Intergalactic Roustabout Scholars" from @UCL, @UPV & @Sorbonne_U. It started in the hackerspaces of San Francisco, Singapore, Shenzhen and Paris, then took a new home in academia, where knowledge can stay public.

Contact:

- Updates: https://twitter.com/hive_tracker

- Mailing list: https://groups.google.com/forum/#!forum/hivetracker

- Team email: hivetracker+owners@googlegroups.com...


  • 1st characterization step

    Drix wrote 08/21/2018 at 21:16

    As in any incremental development, we tried to get a simple working version to characterize it and improve it accordingly.

    HTC Vive system consistency

    The 1st step was to verify that we could measure a coherent 10 mm translation anywhere in our interaction zone.

    We used a CNC to sample positions along a line that can be seen in the following video:

    As represented in the picture below, we moved the CNC to 7 locations to have a reasonable overview:

    Distortion

    You probably saw it coming: the HTC Vive positioning system is not linear, and the distortion depends on the placement of the bases.

    That makes sense, and it's actually OK: a distortion map can be estimated, and the real position can be predicted from it.
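    As an aside, here is a minimal sketch of what such a correction could look like (our illustration only; 1D and piecewise-linear for simplicity, whereas a real map would be 3D):

    ```cpp
    // Illustration of the distortion-map idea: sample the error at known
    // positions (e.g. the 7 CNC locations), then correct new measurements
    // by interpolating between the two nearest samples.
    struct Sample { float measured; float truth; };

    float correct(float x, const Sample* map, int n) {
        // 'map' is assumed sorted by 'measured'; find the bracketing pair
        for (int i = 0; i < n - 1; i++) {
            if (x <= map[i + 1].measured) {
                float t = (x - map[i].measured)
                        / (map[i + 1].measured - map[i].measured);
                return map[i].truth + t * (map[i + 1].truth - map[i].truth);
            }
        }
        return map[n - 1].truth;  // clamp beyond the last sample
    }
    ```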

    Jitter

    A better filter characterization will be published soon, but the measurements below were captured at positions #3 and #6; they show the average measured noise depending on the chosen algorithm:

    You might have noticed that the performance of our current Madgwick and Kalman implementations is complementary, depending on the distance.

    Overall, the accuracy can be estimated at below 1 mm when far from the bases, and around 0.5 mm when close to them.

  • 3d Positioning is working!

    Drix wrote 08/05/2018 at 09:51

    One picture is worth 1000 words; here are moving ones!

    Here we basically demonstrate the principle of our positioning approach:

    The bases send their laser sweeps; we measure the timings, convert them to 3D angles, and observe the 3D intersections.

    This is done in this Blender visualization:

    https://github.com/HiveTracker/Kalman-Filter/blob/master/visualisation.blend

    The math behind this idea is derived here:

    https://docs.google.com/document/d/1RoB8TmUoQqCeUc4pgZck8kMuBPmqX-BuR-kKC7QYjg4/
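    In short, each lighthouse contributes a ray (origin at the base, direction given by its two sweep angles), and the position is estimated where the rays from the two bases nearly cross. Here is a minimal sketch of that standard closest-point construction (our simplification, with our own variable names; see the document above for the full derivation):

    ```cpp
    struct Vec3 { float x, y, z; };
    Vec3  sub(Vec3 a, Vec3 b)  { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    Vec3  add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
    float dot(Vec3 a, Vec3 b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }
    Vec3  cross(Vec3 a, Vec3 b) {
        return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
    }

    // Midpoint of the common perpendicular between rays p1+t1*d1 and p2+t2*d2.
    // Because of noise the rays never intersect exactly, so we take the middle
    // of the shortest segment between them as the estimated 3D position.
    Vec3 rayMidpoint(Vec3 p1, Vec3 d1, Vec3 p2, Vec3 d2) {
        Vec3  n  = cross(d1, d2);   // direction of the common perpendicular
        float nn = dot(n, n);       // ~0 => rays nearly parallel (degenerate)
        Vec3  dp = sub(p2, p1);
        float t1 = dot(cross(dp, d2), n) / nn;
        float t2 = dot(cross(dp, d1), n) / nn;
        Vec3  c1 = add(p1, mul(d1, t1));   // closest point on ray 1
        Vec3  c2 = add(p2, mul(d2, t2));   // closest point on ray 2
        return mul(add(c1, c2), 0.5f);
    }
    ```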

    Finally, the 3D models are available here:

    https://github.com/HiveTracker/Kalman-Filter/blob/master/Frames

  • Libraries validations

    Drix wrote 08/05/2018 at 09:51

    Open source != easy

    Sharing software or hardware is not the only step in making open source projects accessible.

    Developing with open programs and affordable tools is important too.

    But many embedded systems purists would never accept using Arduino in a serious project, and after the difficult Twiz experience with users, it seemed important to find a trade-off.

    Luckily, the excellent work by Sandeep Mistry provides a reasonable one:

    github.com/sandeepmistry/arduino-nRF5

    Hello world

    Once the environment was validated, it was possible to test the boards. It might seem simple, but blinking LEDs can sometimes be very challenging, especially in the Chinese manufacturer's office:

    photos.app.goo.gl/LxQBnC5XiCdQhxJa7

    Complex features

    Back home, it took a while to validate all the firmware components.

    The most complicated was the PPI; we found inspiration in the ex-arduino.org repos and got it to work:

    github.com/HiveTracker/PPI

    To reach this point, many steps were necessary, but the main trick was to use a logic analyser.

    The one visible here was made by Saleae, but for those on a small budget, AliExpress has a few affordable alternatives.

    For those who only work with open tools, Sigrok can help, but the software by Saleae is free, cross-platform, and quite practical.

  • 1st Hardware prototype

    Drix wrote 08/05/2018 at 09:50

    With the miniaturization idea validated, we focused on using only 4 photodiodes: placing them on a tetrahedron ensures that at least 2 are visible to the lighthouses, for any rotation.

    Main parts

    - MCU + BLE: to save some space, we use a fairly small nRF52 module (8 x 8 mm) by Insight.

    - IMU: to save some time, we use the only 9DoF sensor that performs the full sensor fusion on-chip, the BNO055 by Bosch.

    The PCB design files are on upverter.com/hivetracker, but here is an overview of what it looks like:

    It was finished and built during yet another Hacker Trip To China, hanging out with the Noisebridge gang and tinkering with magic parts found in the Shenzhen electronic market.

    Making in China

    Note: there are 2 one-week periods when almost everything is closed in China; lunar new year is probably the most famous, but the golden week holiday is very serious too.

    This board was made during the golden week, so it was not easy. But even though some of the chosen parts were approximate, some manufacturers such as smart-prototyping.com are more flexible, and they got the job done:

  • Miniaturization Idea Validation

    Drix wrote 08/05/2018 at 09:50

    To miniaturize the HTC Vive tracker, we needed to make our own board, and we went through several ideas. Here is a summary:

    Problem

    To achieve accurate timestamps of several photodiode signals on an MCU, using interrupts would not be ideal: they would add delays, and we would miss some pulses.

    1st idea

    We first thought about using the tiny FPGA by Lattice (iCE40UL-1K): it's only 10 x 10 mm and can be flashed with the IceStorm open source toolchain: clifford.at/IceStorm

    Trick

    ...but we were planning to use the nRF52 radio-MCU by Nordic, and remembered that it has a very practical feature that allows taking those timestamps in parallel, without interrupts.

    This feature is called PPI (Programmable Peripheral Interconnect):

    https://infocenter.nordicsemi.com/index.jsp?topic=%2Fcom.nordic.infocenter.nrf52832.ps.v1.1%2Fppi.html
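    As a taste of what PPI looks like in practice, here is a bare-metal sketch using the Nordic MDK register names (the pin and channel numbers are arbitrary examples, not our exact firmware): a GPIOTE event on the photodiode pin triggers a timer capture entirely in hardware, so no interrupt latency is involved.

    ```c
    #include "nrf.h"  // Nordic MDK register definitions

    void ppi_capture_setup(void) {
        // Free-running 16 MHz timer whose counter is captured on each pulse.
        NRF_TIMER1->PRESCALER   = 0;  // 16 MHz -> 62.5 ns resolution
        NRF_TIMER1->BITMODE     = TIMER_BITMODE_BITMODE_32Bit
                                  << TIMER_BITMODE_BITMODE_Pos;
        NRF_TIMER1->TASKS_START = 1;

        // GPIOTE channel 0: event on each edge of the photodiode pin (e.g. P0.12).
        NRF_GPIOTE->CONFIG[0] =
            (GPIOTE_CONFIG_MODE_Event      << GPIOTE_CONFIG_MODE_Pos) |
            (12                            << GPIOTE_CONFIG_PSEL_Pos) |
            (GPIOTE_CONFIG_POLARITY_Toggle << GPIOTE_CONFIG_POLARITY_Pos);

        // PPI channel 0: GPIOTE event -> TIMER1 capture task, no CPU involved.
        NRF_PPI->CH[0].EEP = (uint32_t)&NRF_GPIOTE->EVENTS_IN[0];
        NRF_PPI->CH[0].TEP = (uint32_t)&NRF_TIMER1->TASKS_CAPTURE[0];
        NRF_PPI->CHENSET   = PPI_CHENSET_CH0_Msk;
        // The timestamp is then read back from NRF_TIMER1->CC[0].
    }
    ```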


    Validation

    To verify that the timings were feasible, we enjoyed the great talk* by Valve engineer Alan Yates:

    This video helped us simplify our feasibility estimation; below are the quick & dirty numbers:

    - Max base-to-tracker distance = 5 m (5000 mm), and we want an accuracy of 0.25 mm = Ad

    - Motor speed = 60 turns/sec => 1 turn takes 1/60 s = 16.667 ms = T

    - During T, the travelled distance = 2 Pi R = 31.415 m = D

    - Time accuracy = T / (D/Ad) = (1/60) / (2 Pi * 5 / 0.00025) = 132.6 ns = At

    - PPI sync @ 16 MHz => 62.5 ns = Tppi

    => Theoretical accuracy = Ad / (At / Tppi) = 0.25 / (132.6 / 62.5) = 0.118 mm!
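    For convenience, here are the same back-of-the-envelope numbers as a tiny self-contained check (our re-derivation of the figures above):

    ```cpp
    #include <cstdio>

    int main() {
        const double kPi  = 3.14159265358979;
        const double R    = 5.0;            // max base-to-tracker distance (m)
        const double T    = 1.0 / 60.0;     // one rotor turn (s)
        const double D    = 2.0 * kPi * R;  // arc length swept in one turn (~31.4 m)
        const double Ad   = 0.00025;        // target accuracy (m)
        const double At   = T / (D / Ad);   // required timing accuracy (s)
        const double Tppi = 62.5e-9;        // PPI/timer resolution @ 16 MHz (s)
        printf("At  = %.1f ns\n", At * 1e9);               // ~132.6 ns
        printf("acc = %.3f mm\n", Ad * 1000 * Tppi / At);  // ~0.118 mm
        return 0;
    }
    ```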

    *Video source:

    https://hackaday.com/2016/12/21/alan-yates-why-valves-lighthouse-cant-work/

  • Proof Of Concept

    Drix wrote 08/05/2018 at 09:49

    History

    Before diving into the proof of concept, it's worth recalling what was available before.

    The main alternative is a camera-based solution, and the median cost is $1999:

    http://www.optitrack.com/hardware/compare/

    This solution doesn't even scale anyway: above a dozen trackers, it starts lagging.

    We also needed to embed intelligence, so we decided to look at what was possible and we made our own.

    This project was inspired by various others; here are the fundamentals.

    Legacy

    First off, from inertial sensing to 3D positioning, we went through various stages of thought, analysis and experimentation; here are the relevant links:

    - Twiz - Tiny Wireless IMUz: hackaday.io/project/7121-twiz

    - Wami - Ultrasound 2d positioning: rose.telecom-paristech.fr/2013/tag/wami

    - Plume - Magnetic 3d positioning: hackster.io/plume/plume

    Inspirations

    The following helped tremendously in understanding the details of this system; thanks a lot to them:

    - Trammell Hudson: https://trmm.net/Lighthouse

    - Alexander Shtuchkin: github.com/ashtuchkin/vive-diy-position-sensor (+ wiki !)

    - Nairol: github.com/nairol/LighthouseRedox

    1st feasibility test

    We first replicated the above inspirational approach as a proof of concept, and compared it against the hand-held controllers of the commercial Valve tracking system in an ideal room. We taped a non-regular hexagonal shape on the floor of the testing room, then traced this shape by hand with both devices, recording the devices’ positions using Bonsai.

    Here is a visualisation of the preliminary results:

    This proof-of-concept, which uses only one photodiode, had an average error on the order of 10 mm more than the average error of a commercial tracker.

    Although this approach is not pursued anymore, the code is available:

    bitbucket.org/intelligence-lab/lighthouseserver



Discussions

ktomcat wrote 09/17/2021 at 08:43:

Hi~ Can it be adapted to lighthouse 2?

bincvfx wrote 06/28/2020 at 12:11:

Interested... I just started the GitHub photodiode project for chops. Let me know if there is anything I can help with.

max wrote 03/05/2020 at 16:31:

All you really need is some lasers; most of the work is going to be done on the software side, since you can get positioning, maybe orientation from the deflection of the light, and other data from just one diode.

Timo Birnschein wrote 03/05/2020 at 16:27:

Is someone still working on this project?

I would be interested in helping. A true open source KIS-tracker would be amazing to have. Please reach out if this project is still ongoing.


terencetsengamzon wrote 02/24/2020 at 18:01:

I am very interested in your project; may I know how I can purchase your product?

cyberdenss wrote 11/06/2019 at 19:33:

Hello! When will there be improvements on the TODO list?

TODO list:

- Make current BLE firmware more robust
- Embed calibration (to allow the trackers to measure bases coordinates)
- Embed filtering/fusion (Kalman, etc.)
- LightHouse V2.0 adaptation w/ FPGA
- Cost optimizations...

Faraz Khan wrote 07/24/2019 at 16:54:

Hey guys, I'm interested in purchasing a tracker as well. Unable to join the Google group though - do I need an invite? Thanks!

moritz.skowronski wrote 08/02/2019 at 11:06:

I am also super interested in the project and would love to purchase one as well!

xun.xie wrote 06/24/2019 at 02:43:

Before you officially release it as a product ready for sale, may I get a sample to test? We have a smart glasses project at the moment that is seeking a positioning solution beyond SLAM.

Marco wrote 05/07/2019 at 08:57:

May I know where the main routine for converting lighthouse timing data to a position is?

Drix wrote 05/07/2019 at 09:11:

For the moment, this blender version is the most advanced:
https://github.com/HiveTracker/Kalman-Filter

Marco wrote 05/07/2019 at 09:15:

Thank you very much! I'll check now.

Marco wrote 05/08/2019 at 04:13:

I have checked your code, that's cool!

One question: there is a matrix for the lighthouse position and rotation that confused me... How did you get the magic matrix? Run SteamVR on a PC with the HTC tracker, then read the result?

Drix wrote 05/08/2019 at 11:49:

Yes, to get the magic matrix, we run SteamVR on a PC with the HTC Vive kit, then read the result with Bonsai:
https://github.com/HiveTracker/bonsai-interface

cyberdenss wrote 11/07/2019 at 20:42:

Hi! How does it determine the coordinates of the base stations? Can it be used without the coordinates of the base stations?

SeongBong Bae wrote 01/11/2019 at 05:59:

I can't find information about the PC-side application that receives the data.

Can you tell me about the PC application that tracks the HiveTracker in the demo?

If possible, I wonder if I can get the source code of the PC application side.

Drix wrote 05/07/2019 at 09:10:

For the moment, the blender version is the most advanced:
https://github.com/HiveTracker/Kalman-Filter

CNLohr wrote 11/05/2018 at 19:27:

Awesome meeting you at the superconference... Just FYI, there has been more discussion of the 2.0 decoding stuff recently; you may want to check it out here: https://github.com/cnlohr/esptracker/issues/1

I am still specifically interested in an ESP32-based approach, but I am confident we can leverage each other's work to some degree.

Lee Cook wrote 09/06/2018 at 12:02:

Are you still using off-the-shelf sensor modules, or have you started to make your own? I see that you have libraries for the new data+envelope chip, but I can't find schematics for the sensor board.


Neizvesten wrote 09/05/2018 at 13:44:

Is the miniature tracker compatible with SteamVR Tracking 2.0?

Drix wrote 09/16/2018 at 11:24:

Sadly not for the moment, and it might require adding a little FPGA...

Adrian Onsen wrote 08/22/2018 at 18:01:

This is a fantastic project!
You mentioned sub-millimeter accuracy. Have you done any testing to figure out just how precise the tracker motion can be?
Are we talking 0.9 mm, or are we talking 0.1 mm?

Cheers!

Drix wrote 08/23/2018 at 00:26:

Thanks for your interest ;)
We just started a 1st round of characterization: *for now* we are between 0.25 mm and 0.8 mm depending on the distance.
There will be a project log about it in the next weeks hopefully ;)

Lhun wrote 09/04/2018 at 15:59:

This sounds great! I asked this earlier but didn't get a clear answer: I'm interested in using this as an alternative to Vive trackers, but we would like them to work AS IF they were Vive trackers, without software emulation trickery, to make them compatible with existing "Vive tracker" compatible applications. I didn't see a direct OpenVR driver on your GitHub (if I missed it, I'm sorry), but I would be interested in doing something like this: https://github.com/spayne/soft_knuckles but reporting to SteamVR as a tracker.

Drix wrote 09/05/2018 at 00:41:

Hey,
Making our trackers compatible with the HTC Vive should be possible, but we're pretty far from it; we build these things step by step ;)

bosse wrote 08/05/2018 at 19:32:

Will it be possible to buy your hardware in smaller quantities?

I would like to try to use it for my application. Our small project does not have the capacity to design our own hardware.

We are using similar hardware, a BNO055 and nRF52832, to create a head tracker with head direction info only.

https://github.com/bossesand/OHTI

Drix wrote 08/06/2018 at 12:08:

Sure!

If you subscribe to the mailing list we'll announce when we make another batch:

https://groups.google.com/forum/#!forum/hivetracker

Thanks for your interest ;)

rombobellogia wrote 12/12/2020 at 20:00:

Hey Drix, we're looking forward to seeing if/how we can make use of this for tracking real cameras/lights in UE4/5.
