Motion Capture system that you can build yourself

An open hardware-software framework based on inertial sensors that anyone can use to build a human motion capture system.

A couple of years ago I wanted to make a digital performance with a dancer on stage, and wanted to use a mocap suit to get his movements. There was none available at an affordable price, so I started the development of this one.

In the meantime, several cheaper options came out, but they remain out of reach for most users and, more importantly, they work under proprietary licenses. You can’t modify the way they work, or use part of them in another software project.

As an alternative, this is a motion capture system that can be easily assembled by anyone, so they can start capturing as soon as they finish building it. Additionally, it is an open hardware-software framework that can be freely tweaked, enhanced, or used as part of another project.

Chordata is a motion capture system that you can build yourself. Our goal is to bring motion capture into the universe of DIY technologies with an Open Hardware approach.

For a quick introduction you can watch the video below, or visit our website.

This project consists of 3 parts:


Hardware (K-Ceptor):

Motion capture is about getting the orientation of every body limb or part in real time, as accurately as possible. A simple MEMS IMU device* and freely available sensor fusion algorithms are enough to get a decent result. The problem starts when you want to get the data from several devices. Most of these devices come with an i2c interface, but their address is fixed in the hardware. So one of the building blocks of Chordata is a sensing unit capable of coexisting with several “siblings” on the same bus. So far I have developed the “K-Ceptor”, which allowed me to build the rest of the project. It consists of an LSM9DS1 IMU and an LTC4316 i2c address translator. The focus of the whole project is to reduce costs, so all the passive components on the board are through-hole, shifting most of the assembly work from the industrial manufacturer to the final user, while saving money in the process.
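The address-translation trick can be sketched in a few lines. The LTC4316 remaps a device's 7-bit address by XORing it with a value set through resistors; the masks below are illustrative, not the actual values used on the K-Ceptor:

```python
# Sketch: how an XOR-based i2c address translator (like the LTC4316)
# lets identical sensors coexist on one bus. The masks are illustrative.
LSM9DS1_ADDR = 0x6B  # fixed 7-bit accel/gyro address, identical on every sensor

def translated(addr: int, xor_mask: int) -> int:
    """The translator XORs the device's 7-bit address with a resistor-set mask."""
    return addr ^ xor_mask

# Give each of three sensor nodes a different mask:
masks = [0x00, 0x01, 0x02]
bus_addresses = [translated(LSM9DS1_ADDR, m) for m in masks]
assert len(set(bus_addresses)) == 3  # no collisions on the shared bus
```

Each node then answers at a unique address even though every LSM9DS1 is hard-wired to the same one.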

Software (Hub):

Getting data from a lot of sensors in real time, processing it, and sending it in an easy-to-read format to some client is not a simple job, so I’m developing a piece of software from scratch to deal with it.

It is responsible for:

  • Building a digital model of the physical hierarchy of sensors. Initializing the i2c communication on the Hub, and running the configuration routine on each of the sensors.
  • Performing a reading on each of the sensors at the specified refresh rate.
  • Correcting each sensor reading with the deviation obtained on a previous calibration process.
  • Performing sensor fusion on the corrected sensor reading, obtaining absolute orientation information in the form of a quaternion.
  • Sending the orientation data, together with the sensor_id and a timestamp, to the client using an open protocol (such as OSC).
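The steps above can be sketched as a minimal pipeline. The sensor id, calibration values, and the placeholder fusion step are illustrative; the real hub implements this in C++ with a proper fusion filter:

```python
import time

def correct(raw, bias, scale):
    """Calibration step: remove bias, rescale (values are illustrative)."""
    return [(r - b) * s for r, b, s in zip(raw, bias, scale)]

def fuse(corrected):
    """Placeholder for the sensor-fusion step; a real implementation
    (e.g. a Madgwick or Mahony filter) would compute the orientation."""
    return (1.0, 0.0, 0.0, 0.0)  # identity quaternion (w, x, y, z)

def make_packet(sensor_id, quat):
    """Bundle id + quaternion + timestamp, ready to serialize (e.g. as OSC)."""
    return {"id": sensor_id, "q": quat, "t": time.time()}

raw = [0.11, -0.02, 9.90]                       # one hypothetical reading
q = fuse(correct(raw, [0.01, -0.02, 0.09], [1.0, 1.0, 1.0]))
packet = make_packet("arm_r", q)                # "arm_r" is a made-up id
assert len(packet["q"]) == 4
```

The same read-correct-fuse-send cycle repeats for every sensor at the configured refresh rate.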

After several tests I found that a Single Board Computer running Linux was the best choice to host such a program, so all of the development of this part of the software has been done in C++, using a Raspberry Pi 3 as the hub. Some of the advantages of this type of hub, in comparison with simpler microcontrollers, are:

  • It’s not an expensive component.
  • Programming and debugging is enormously simplified.
  • Some of them, like the rPi3, come out of the box with all the communication peripherals needed to perform a comfortable capture, the Wi-Fi adapter being a notable example.

The choice of performing the sensor fusion inside the hub is based on:

  • The higher cost of IMU units capable of performing sensor fusion on-chip.
  • The better accuracy of sensor fusion performed after the raw data has been corrected using a previously done calibration.
  • Since the bandwidth of the i2c bus creates a bottleneck in the sensors’ data acquisition, performing the sensor fusion inside the hub doesn’t add significant overhead.
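A back-of-the-envelope calculation illustrates that bottleneck; the byte counts and rates below are assumptions, not measurements from the project:

```python
# Rough i2c budget for a full-body suit (assumed figures):
# a 400 kHz fast-mode bus moves at most ~50 kB/s of payload, and each
# 9-DoF read is assumed to be 18 bytes (gyro + accel + mag, 6 bytes each).
BUS_BYTES_PER_S = 400_000 // 8   # upper bound, ignoring protocol overhead
BYTES_PER_READ = 18
sensors, rate_hz = 15, 50        # 15 sensors sampled at 50 Hz

needed = sensors * rate_hz * BYTES_PER_READ
assert needed < BUS_BYTES_PER_S  # 13.5 kB/s fits under the bus ceiling
```

The bus, not the Pi's CPU, sets the pace, so running the fusion filters on the hub between reads costs essentially nothing.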

Software (Client):

Since the protocol with which the data is transmitted is open and documented, the client can be anything capable of displaying a 3D skeleton.

Most of the time I'm using a python script running in Blender that grabs the quaternion data from OSC, and rotates the bones of a 3D armature.
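As a minimal sketch of what such a client does with each incoming quaternion (written outside Blender for clarity; in the add-on the quaternion would simply be assigned to a pose bone's `rotation_quaternion`):

```python
import math

def normalize(q):
    """Scale a quaternion (w, x, y, z) to unit length before using it."""
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def rotate(q, v):
    """Rotate vector v by unit quaternion q:
    v' = v + 2w(qv x v) + 2(qv x (qv x v)), with qv = (x, y, z)."""
    w, x, y, z = q
    qv = (x, y, z)
    t = tuple(2.0 * c for c in cross(qv, v))
    u = cross(qv, t)
    return tuple(v[i] + w * t[i] + u[i] for i in range(3))

# A 90-degree rotation about Z turns the X axis into the Y axis,
# i.e. a bone resting along X would now point along Y:
q = normalize((math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4)))
vx, vy, vz = rotate(q, (1.0, 0.0, 0.0))
assert abs(vx) < 1e-9 and abs(vy - 1.0) < 1e-9
```

This is the whole job of the client loop: take the quaternion for each sensor_id and apply it to the matching bone of the armature.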

The idea is to release a basic client in the form of a Blender add-on responsible for:

  • Establishing some handshake communication with the hub, checking compatibility and status.
  • Communicating the status to the user (the human in front of the computer).
  • Acting as a GUI to run the on-pose calibration procedures...

IMU Proto - EAGLE sources and gerbers

Source and gerber files for the IMU-Proto board based on a LSM9DS0 sensor and a PCA9544A i2c multiplexer

Zip Archive - 148.65 kB - 09/29/2017 at 08:55


  • 1 × Raspberry Pi 3 or other SBC
  • 15 × LSM9DS1 IMU (number of units for an average full body suit)
  • 15 × LTC4316 i2c address translator (number of units for an average full body suit)
  • 36 × RJ12 6p6c connector (number of units for an average full body suit)
  • 15 × 240C2C EEPROM memory (number of units for an average full body suit)


  • New video, web and social

    Bruno Laurencich4 days ago 0 comments

    Today we've launched a new video explaining all the features of Chordata, the Open Source motion capture system that you can build yourself. Make sure to check it out on our YouTube channel and give it a like.

    We’ve also launched our new website, which features links to all of our social networks. With all of these tools, you’ll now be able to follow the evolution of Chordata without missing a single step of the process.

    We hope you enjoy them!

    All this couldn’t have been possible without the work of our new team members, Juancho and Flavia, who are helping out with communications and social media.

    If you want to give the project a little help, it’s as easy as giving it a like here on Hackaday, which will help us stand out in the Human-Computer Interface round of the current edition of the Hackaday Prize.

  • Hackaton postmortem

    Bruno Laurencich07/29/2018 at 14:45 0 comments

    The motion capture hackathon took place last week. It was great to see the Chordata system used on so many different projects!

    Here's a quick recap of the work done:

    Arianna: a great prototype for an easy low cost DIY sensor attachment suit.

    Antonio: networking improvements. SBC as an access point.

    Emanuele: 3D modelling in Blender with mocap

    Lorenzo and Matteo: Mocap suit as musical controller in Max/MSP

    Kine: thanks for soldering all those sensors!

    Massimo and Sara: processing SDK foundations. This gives the user a simplified interface to work with the mocap information.

    Mauro and Alexia: Mocap suit as a musical controller in a Supercollider

    Stefano: Unity 3D integration. Making visuals to go with the music.

    Andrea: Some mysterious project (as usual)

    We gathered lots of information about bugs, and possible interface improvements!

    Not to forget, of course... The winner of the hackathon by unanimous decision: Valerio and his dinosaur capturing tail prosthesis!!

    Many thanks to all the participants. We had a great time!

  • Motion capture Hackathon

    Bruno Laurencich07/10/2018 at 15:03 0 comments

    On July 21st the first Chordata motion capture Hackathon will take place!

    It will be an occasion for users to test the system before the release, and for us to catch some bugs ;).

    The idea is to bring together performing artists and digital creators to explore the possibilities of mocap as an artistic discipline.

    It will take place in Rome, at Workco coworking. If you live somewhere else a stream of capture data will be available to let you work from home!

    If you want to participate, fill in the form here, or just leave a comment below.

  • The first release is coming!

    Bruno Laurencich05/25/2018 at 17:32 0 comments

    Over the last months we’ve been working hard to achieve the capture stability required in any real-world scenario, and to be honest the results are not bad at all. So it’s time now to stop pushing in this “improving stability” direction and focus on the minimal interface changes needed for a release.
    We kept it to ourselves all this time because we felt there was no sense in publishing a shaky, or hardly usable, system. But at this point it would be great to have hands other than ours using and testing it, finding bugs or imagining a better user experience.

    So, we set August 2018 as the first release date. Looking forward to it!


    In the meantime we’ve been showcasing in a couple of places:

    Mmkm STUDIO @ Open House Roma.

    The friends of mmkm studio invited us to present a little performance with dance, music, and live-generated visuals. It was the first time we used the Chordata suit for what it was originally conceived for... three years ago!

    But it was also the first time the suit was used on a real “stage” in front of other people, having to deliver correct results during a definite period of time, and it went pretty well. This was the kind of proof we needed to convince ourselves that the little bird is ready to leave the nest.

    Hope you like it:

    For this presentation, and most of the time, we are using a 3D model by AlexanderLee. Thanks Alexander for sharing it!

    Codemotion Rome:

    This is one of the most important programmers’ conferences in Europe. It was great to be able to share the guts of the software implementation with so many people; we got tons of interesting opinions and advice.

    We would like to thank Ugo Laurenti and Sara Di Bartolomeo for being there helping out with the stand, and also Flavia Laurencich and Juancho Casañas for the support.

    Thanks a lot!

  • Chordata @ Maker Faire Rome 17

    Bruno Laurencich12/11/2017 at 20:32 0 comments

    It's been a while since my last update here.

    Last month we were very busy making a better-looking prototype to be shown at this year's edition of the Maker Faire, here in Rome. This new prototype really shows the potential of the system, even though several adjustments and calibration procedures still have to be implemented.

    The public's response was really positive, and several people showed interest in access to a cheap and hackable mocap system. This response convinced us to put more effort into its development.

    We also received some exciting collaboration proposals, so perhaps we'll have some interesting announcements to make in the coming weeks.

    Stay tuned.. 

  • Hands on the second physical prototype

    Bruno Laurencich11/09/2017 at 18:22 0 comments

    I finally have all the parts to start building a second and more complete version of the system. The PCBs, stencil, and components arrived some time ago, but the solder paste kept me waiting for a long time; if you want to hear the whole story, please refer to The solder paste odyssey.

    If everything goes right I should be able to build at least twelve of the sensor nodes, and arrange my first whole-body capturing suit.

    The problem: apart from the general refactoring that I’m performing on the code, I will have to implement reading the LSM9DS1 instead of the LSM9DS0 from the previous board. Fortunately SparkFun offers an Arduino library, which can be easily adapted to this system.

    The real problem: home soldering 12 of these units with their tiny LGA packages, represents 12 one-shot opportunities... of getting it wrong.

    The article linked below also shows some of the cutting-edge equipment with which I’ll shoot these twelve shots.

  • Refactoring the PCB

    Bruno Laurencich10/19/2017 at 19:14 0 comments

    As I said, it was time to make a second version of the PCB in order to be able to build a complete body suit. I’ve called it “K-Ceptor” (Kinetic perCEPTOR).

    The changes are detailed on the previous log entry, and listed here:

    • Replaced the LSM9DS0 with the LSM9DS1
    • Added an address translator and removed the multiplexer
    • RJ-12 connector for both input and output (or optionally solder a regular 2.54mm header)
    • Added an EEPROM memory

    One thing that I hadn’t planned for (it came up while I was making this new PCB) was to move some of the components to a separate board: the “id_module”.

    This module is a tiny, single-layer PCB containing the EEPROM and some resistors that set the translation value of the LTC4316 (i2c address translator).

    This separation allows for greater flexibility and reuse of hardware resources. For example, suppose a user has a complete suit and, at some point, uses it for two different activities taking place in different environments, namely: a capture for an animation performed outdoors, and the rehearsals of a live performance in a theater. Since the electromagnetic interference in each location is completely different, ideally a calibration* would be performed on each sensor at least once for each place. Having a duplicate set of cheap id_modules would allow the user to easily apply the corresponding calibration before each use.

    (*) again: I'm talking about the sensor calibration, not to be confused with the pose calibration which should be performed before every capture.

    A render of the id_module stacked in position, on top of the K-Ceptor.

  • Current situation and ongoing work

    Bruno Laurencich10/03/2017 at 09:04 0 comments

    Here's a video showing the current state of the capture. This 3-sensor prototype is what I've been working with over the last months; even if it's not as spectacular as a whole-body capturing suit, it allowed me to easily test the features as they were implemented.

    This part of the development focused on:

    • General stability of the program.
    • Capability of reading sensors arranged in any arbitrary disposition (or hierarchy).
    • Obtaining readings from each of the sensors at a regular interval, no matter where in the hierarchy it sits.
    • Capability of reading a single sensor in the hierarchy, processing its raw data, and generating calibration information, then dumping this information to a file.
    • Implementing a correction step for each sensor, using data obtained in a previously done calibration, before the sensor fusion.
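As an example of that calibration step, a common and simple way to derive magnetometer hard-iron offsets is to sweep the sensor through all orientations and take the per-axis midpoint of the extremes (a simplification, and not necessarily the exact routine used in this project):

```python
# Hard-iron calibration sketch: the raw samples are made up, and a real
# calibration would use many more readings from a full rotation sweep.
samples = [(120, -30, 55), (80, -70, 15), (100, -50, 35), (140, -10, 75)]

def hard_iron_offsets(samples):
    """Per-axis midpoint of the min/max bounding box of the samples."""
    axes = list(zip(*samples))                   # transpose to per-axis lists
    return [(max(a) + min(a)) / 2 for a in axes]

offsets = hard_iron_offsets(samples)
# Subtracting the offsets re-centers the readings around zero:
corrected = [tuple(v - o for v, o in zip(s, offsets)) for s in samples]
assert offsets == [110.0, -40.0, 45.0]
```

Dumping `offsets` to a file is then all the "generate calibration information" step needs to do.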

    The IMU-Proto sensor unit

    The physical base of the prototype is a simple PCB featuring an IMU sensor and an i2c multiplexer. The idea was that these units could be easily interconnected, allowing the creation of tree-shaped hierarchies. So it had a 4-pin input and output carrying the current and the i2c bus. It also exposed pins for the secondary gates of the multiplexer.

    This arrangement was great for testing, but now I'm working on a more user-friendly version of the sensing unit, which will have the following features:

    • No on-board multiplexer (it will live on a separate unit); an address translator instead.

    The multiplexer works fine, but it wasn't really used on all nodes; on the other hand, it added unnecessary overhead to the bus, since it had to be switched for each sensor before the reading. Having it on a separate unit will instead allow more flexible creation of arbitrary trees.

    • An easy pluggable connector.

    Of course, having to solder 4 wires in order to create the suit wasn't flexible at all. This connector should allow the performer to move freely while keeping the connection stable; it should be cheap, common, and not excessively bulky. At the moment I'll go with the RJ-12 connector (the one regular telephones use).

    • An on-board memory.

    The main function of this memory will be storing the sensor calibration data. This calibration only needs to be performed once in a while*, and until now the generated data was stored in a file on the Hub, preventing a particular sensor from changing Hub, or position in the hierarchy.
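To illustrate the idea, the calibration could be packed into a fixed-size blob that travels with the sensor in its EEPROM; the 6-float layout below is a hypothetical format, not the project's actual one:

```python
import struct

# Hypothetical EEPROM layout: 3 bias floats + 3 scale floats, little-endian.
FMT = "<6f"

def pack_calibration(bias, scale):
    """Serialize calibration to a fixed-size blob suitable for an EEPROM."""
    return struct.pack(FMT, *bias, *scale)

def unpack_calibration(blob):
    """Recover (bias, scale) from the stored blob."""
    vals = struct.unpack(FMT, blob)
    return list(vals[:3]), list(vals[3:])

blob = pack_calibration([0.01, -0.02, 0.09], [1.0, 0.98, 1.02])
bias, scale = unpack_calibration(blob)
assert len(blob) == 24  # 6 floats x 4 bytes each
```

With the blob stored on the node itself, any hub can read it back and apply the right correction, wherever the sensor sits in the hierarchy.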

    (*) I'm talking about the sensor calibration, not to be confused with the pose calibration which should be performed before every capture.


  • 1
    0) Coming soon...

    We are working hard to give you the best experience when building your own system. All the software, hardware and sources will be available soon. Once released, the building process will look something like the following steps:

  • 2
    Get the hardware

    a. (The easy way):

    Buy pre-assembled kits at our Tindie store. Each one comes with the board and some pre-soldered components, and includes a few additional components to be soldered by you.

    b.  (The hard way):

    Download the sources, send them to your favorite PCB manufacturer. Buy the components and solder everything together.

  • 3
    Additional hardware:

    Apart from our hardware you will need a regular microcomputer like the Raspberry Pi. All the sensors and the microcomputer are fixed to the performer’s body, and connected together with telephone-like wires (6 cores, ~26 AWG, with crimped RJ-12 connectors).





JAMES MARK WILSON wrote 07/18/2018 at 05:34 point

great stuff


Bruno Laurencich wrote 07/20/2018 at 10:11 point

thanks :)


Sophi Kravitz wrote 07/16/2018 at 18:14 point

Pretty exciting project... def interested!


Bruno Laurencich wrote 07/20/2018 at 10:11 point

Honored by your interest Sophi, I saw you doing pretty cool stuff around here!


Patrick Lowry wrote 07/16/2018 at 15:30 point

Awesome, guys. This is something I've been thinking about for a while now, so very pleased to see someone actually doing it. I'm keen on mixing the mocap with VR. I look forward to you releasing the instructions in August. Best of luck.


Bruno Laurencich wrote 07/16/2018 at 17:10 point

Thanks! looking forward to see what you can do with it!


Francois Medina wrote 05/02/2018 at 15:10 point

Amazing project! I am eager to build mine following the instructions when they are released. I am an absolute zero at these electronic things but I would love to build a mocap suit for 3D animation. As a 3D artist, if I can help you in some way, please reach out to me.


Bruno Laurencich wrote 05/25/2018 at 17:20 point

hi Francois, I have good news for you, the first release is coming! The idea is to create a system that can be assembled with no previous knowledge, only the will to do it ;)

