Motion Capture system that you can build yourself

An open hardware-software framework based on inertial sensors that anyone can use to build a human motion capture system.

A couple of years ago I wanted to make a digital performance with a dancer on stage, and wanted to use a mocap suit to capture his movements. None was available at an affordable price, so I started the development of this one.

In the meantime, several cheaper options came out, but they remain out of reach for most users and, more importantly, they work under proprietary licenses. You can’t modify the way they work, or use parts of them in another software project.

As an alternative, this is a motion capture system that anyone can easily assemble, and start capturing with as soon as it is built. Additionally, it is an open hardware-software framework that can be freely tweaked, enhanced, or used as part of another project.

Chordata is a motion capture system that you can build yourself. Our goal is to bring the world of motion capture to the universe of DIY technologies with an Open Hardware approach.

For a quick introduction you can watch the video below, or visit our website.

Or perhaps you prefer to dive directly into the code and KiCad sources in our repositories on GitLab.

This project consists of 3 parts:


Hardware (K-Ceptor):

Motion capture is about getting the orientation of every body limb or part in real time, as accurately as possible. A simple MEMS IMU device* and freely available sensor fusion algorithms are enough to get a decent result. The problem starts when you want to get data from several devices. Most of these devices come with an i2c interface, but their address is fixed in the hardware. So one of the building blocks of Chordata is a sensing unit capable of coexisting with several “siblings” on the same bus: the “K-Ceptor”. It consists of an LSM9DS1 IMU and an LTC4316 i2c address translator. The focus of the whole project is to reduce costs, so most components on the board are through-hole, passing most of the assembly work from the industrial manufacturer to the final user while saving money in the process.
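
To illustrate the address-translation trick, here is a minimal Python sketch (not part of the Chordata sources) of how an XOR-based translator such as the LTC4316 lets identical IMUs coexist on one bus. The mask values are hypothetical; on the real board they are set in hardware:

```python
# The LTC4316 derives a new bus address by XOR-ing the incoming i2c
# address with a hardware-configured value, so identical LSM9DS1 chips
# (fixed address, e.g. 0x6B) can share a bus without conflicts.

LSM9DS1_ADDR = 0x6B  # one of the IMU's pin-selectable accel/gyro addresses

def translated_address(fixed_addr: int, xor_mask: int) -> int:
    """Address seen on the upstream bus after XOR translation."""
    return (fixed_addr ^ xor_mask) & 0x7F  # i2c addresses are 7 bits

# Hypothetical masks for three K-Ceptors on the same bus:
masks = [0x00, 0x10, 0x20]
addresses = [translated_address(LSM9DS1_ADDR, m) for m in masks]
print([hex(a) for a in addresses])  # three distinct addresses, one per unit
```

Because the translation is a fixed XOR, the hub can always recover which physical unit it is talking to from the translated address alone.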

Software (Hub):

Getting data from many sensors in real time, processing it, and sending it in an easy-to-read format to some client is not a simple job, so I’m developing software from scratch to deal with it.

It is responsible for:

  • Building a digital model of the physical hierarchy of sensors. Initializing the i2c communication on the Hub, and running the configuration routine on each of the sensors.
  • Performing a reading on each of the sensors at the specified refresh rate.
  • Correcting each sensor reading with the deviation obtained on a previous calibration process.
  • Performing sensor fusion on the corrected sensor readings, obtaining absolute orientation information in the form of a quaternion.
  • Sending the orientation data, together with the sensor_id and a timestamp, to the client using an open protocol (such as OSC).
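
The steps above can be sketched in a few lines of Python (the real hub is written in C++; the function names, the OSC address pattern, and the fake readings here are illustrative assumptions, not the actual API):

```python
import time
from dataclasses import dataclass

@dataclass
class Sensor:
    sensor_id: str
    gyro_bias: tuple  # deviation obtained during a previous calibration

def read_raw(sensor):
    # Stand-in for the i2c burst read of the IMU registers.
    return (0.5, -0.2, 0.1)

def correct(raw, bias):
    # Subtract the calibration deviation from the raw reading.
    return tuple(r - b for r, b in zip(raw, bias))

def fuse(corrected):
    # Stand-in for the real sensor fusion; returns (w, x, y, z).
    return (1.0, 0.0, 0.0, 0.0)

def capture_cycle(sensors):
    messages = []
    for s in sensors:
        q = fuse(correct(read_raw(s), s.gyro_bias))
        # One OSC-style message per sensor: address, quaternion, timestamp.
        messages.append(("/Chordata/q/" + s.sensor_id, q, time.time()))
    return messages

msgs = capture_cycle([Sensor("base", (0.5, -0.2, 0.1))])
print(msgs[0][0], msgs[0][1])
```

Running this cycle at the configured refresh rate, once per frame over all sensors in the hierarchy, is essentially what the hub loop does.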

After extensive testing I found that a Single Board Computer running Linux was the best choice to host such a program, so all of the development of this part of the software has been done in C++, using a Raspberry Pi 3 as the hub. Some of the advantages of this type of hub, compared with simpler microcontrollers, are:

  • It’s not an expensive component.
  • Programming and debugging is enormously simplified.
  • Some of them, like the rPi3, come out of the box with all the communication peripherals needed to perform a comfortable capture, the Wi-Fi adapter being a remarkable example.

The choice of performing the sensor fusion inside the hub is based on:

  • The higher cost of IMU units capable of performing sensor fusion on-chip.
  • The higher accuracy of sensor fusion performed after the raw data has been corrected using a previously done calibration.
  • Since the bandwidth of the i2c bus creates a bottleneck in the sensors’ data acquisition, performing the sensor fusion inside the hub doesn’t add significant overhead.
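
A back-of-the-envelope calculation illustrates that bottleneck. The per-sensor overhead figure below is an assumption for the sake of the example, not a measured value:

```python
# Rough estimate of the maximum full-body frame rate imposed by the i2c
# bus alone. All figures are illustrative assumptions.

BUS_HZ = 400_000           # i2c fast mode clock
BITS_PER_BYTE = 9          # 8 data bits + 1 ACK clock per byte
bytes_per_sensor = 18 + 6  # 9 axes x 2 bytes, plus ~6 bytes of addressing overhead (assumed)
sensors = 15               # units in an average full-body suit

bytes_per_frame = bytes_per_sensor * sensors
seconds_per_frame = bytes_per_frame * BITS_PER_BYTE / BUS_HZ
max_rate_hz = 1 / seconds_per_frame
print(f"max ~{max_rate_hz:.0f} frames/s for {sensors} sensors")
```

With the bus already capping the frame rate at around a hundred hertz, the hub's CPU has plenty of idle time per frame to run the fusion math.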

Software (Client):

Since the protocol with which the data is transmitted is open and well defined, the client can be anything that is capable of displaying a 3D skeleton.

Most of the time I'm using a Python script running in Blender that grabs the quaternion data from OSC and rotates the bones of a 3D armature.
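
Conceptually, that script does something like the following. Inside Blender the last step would be assigning to a pose bone's rotation; here the Blender-specific part is stubbed out since bpy only exists inside Blender, and the OSC address pattern and bone name are made up for the example:

```python
import math

def normalize(q):
    # Quaternions coming off the wire may not be exactly unit length.
    w, x, y, z = q
    n = math.sqrt(w*w + x*x + y*y + z*z)
    return (w/n, x/n, y/n, z/n)

bones = {}  # bone name -> latest orientation (stand-in for the armature)

def on_osc_message(address, quat):
    """Handler for messages like /Chordata/q/<bone> carrying (w, x, y, z).

    In Blender this would end with:
        armature.pose.bones[bone].rotation_quaternion = quat
    """
    bone = address.rsplit("/", 1)[-1]
    bones[bone] = normalize(quat)

on_osc_message("/Chordata/q/forearm_L", (2.0, 0.0, 0.0, 0.0))
print(bones["forearm_L"])
```

Since each message fully determines one bone's orientation, the client is stateless per frame, which is what makes recording and retransmission straightforward.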

The idea is to release a basic client in the form of a Blender add-on responsible for:

  • Establishing some handshake communication with the hub, checking...


Hub program of the Chordata open source motion capture system

gzip - 2.23 MB - 08/22/2018 at 19:22


Sensing hardware of the Chordata open source motion capture system. KICAD project

Zip Archive - 102.48 kB - 08/22/2018 at 19:22


ID plugin for the KCeptor units of the Chordata open source motion capture system. KICAD project

Zip Archive - 7.87 kB - 08/22/2018 at 19:22


Hub hardware of the Chordata open source motion capture system. KICAD project

Zip Archive - 145.81 kB - 08/22/2018 at 19:22


Blender add-on that allows you to receive, record, and retransmit physical motion capture data coming from a Chordata open source motion capture system

Zip Archive - 35.38 kB - 08/22/2018 at 19:22


View all 6 files

  • 1 × Raspberry Pi 3 or other SBC
  • 15 × LSM9DS1 (Number of units for an average full body suit) 9-axis IMU
  • 15 × LTC4316 (Number of units for an average full body suit) i2c address translator
  • 36 × RJ12 6p6c connector (Number of units for an average full body suit) Passive
  • 15 × 240C2C (Number of units for an average full body suit) EEPROM memory

View all 8 components

  • Upcoming documentation

    Bruno Laurencich 09/28/2018 at 16:58 0 comments

    These days we are working hard to prepare the documentation. We are expecting to publish a big part of it by the end of October, so stay tuned for detailed explanations on how the system works and how it can be built!

    In the meantime, we wanted to share with you an image of the cutting edge technology application we use for note-taking  :P

  • Mocap as a sound controller

    Bruno Laurencich 09/17/2018 at 09:00 0 comments

    Chordata was born several years ago with the idea of using it to create art from human motion. It took a long time of technical development, but now that the system is up and running we're having so much fun collaborating with several artists to test the possibilities of the system on different disciplines.

    For example, last weekend we were with our friends of Creative Coding Roma, using the suit as a sound controller for livecoding with SuperCollider. Here's a little teaser of what we made:

    Being able to test it in real use cases and get feedback from real artists is invaluable for improving the usability and stability of the system!

    A very special thanks to Sergio and Valentina from Orange8, who provided us with a really cool location in an old church for experimenting in Gaeta, Italy.

  • The testbed!

    Bruno Laurencich 09/07/2018 at 08:47 0 comments

    One peculiarity of this project's development is the number of units needed for every prototype. Every time we want to test something new we have to build around 15 sensing units (K-Ceptors). Anyone who has tried hand soldering SMD components knows how easy it is to get it wrong. So we are proud to introduce our new assistant in the lab: the testbed!

    It allows us to troubleshoot the boards as soon as we take them out of the oven, saving us incredible amounts of time (and tears).

    We would really like to thank SparkFun Electronics for the inspiration they gave us by publishing their production process, and showing (among many other things) how they test their widgets with a pogobed.

  • New model topology and rigging

    Bruno Laurencich 08/24/2018 at 14:02 0 comments

    Since we’ve focused our communication efforts on an electronics-oriented website, we’ve been omitting an important part of our project: the 3D model and rigging we use to visualize the captures.

    At the beginning we used a slightly modified version of a female model that we downloaded from blendswap.

    The model was originally created by AlexanderLee and we really liked its shape, but it was not optimized for real-time visualization, and the rigging didn’t match our requirements. This meant that Bruno (Chordata’s founder and a 3D animator himself) had to apply some changes on the fly and in a rush. Then, as usually happens, we kept recycling the model that resulted from those quick changes.

    That kept being a limitation until one day François showed up offering his 3D modeling and rigging skills. He did a great job adapting the model to be used in real-time motion capture.

    The changes he made are subtle, and almost invisible to those who are not specialized in 3D modeling:

    -Retopology: This is the process of rearranging the flow of polygons on a mesh to better respond to certain deformations. Check out the comparison images below: do you notice how smooth the deformation in the front area of the shoulder is with the new topology?

    Old topology

    New topology

    -New skinning and weight paint: The “weights” are just values inside each vertex that determine the amount of deformation each bone produces. To correctly set these values, a process similar to spray painting is used. It’s a long process that requires continuous adjustments over several areas, over and over again…

    Thanks François for this great contribution! Even if the captures can be applied to any arbitrary model and rigging, having a good default one for visualization is a great improvement that will allow us to improve Chordata and test it in better conditions.

  • New network infrastructure​

    Bruno Laurencich 08/24/2018 at 13:53 0 comments

    Following the work done during our motion capture hackathon last month, our new collaborator Antonio Ianiero (iant) made some interesting modifications to the networking infrastructure.

    Before this, an external router (or a smartphone acting as a hotspot) had to be used to create a small ad-hoc network, to which the microcomputer and the client PC had to connect.

    Antonio saw this solution as impractical and inefficient. Instead, he configured the microcomputer to act as an access point: on powering up, it exposes an SSID to which any Wi-Fi capable device can connect.

    In this way, not only do we eliminate an unnecessary intermediary in the network, but the portability of the system is also considerably improved. For example, to capture outdoors you only need to carry a laptop with charged batteries.

    Thanks to Antonio for such a huge boost!

  • Release Note [v0.1.0 ALPHA]

    Bruno Laurencich 08/22/2018 at 19:30 0 comments

    Dear Chordata followers and collaborators, we have some awesome news to share with all of you: the day has come when we finally release the core of Chordata’s motion capture system. In this initial release, we’ll make our code repositories public so that anyone with a little bit of electronics and/or coding experience can go ahead and dive into the intricacies of our project. Those brave enough will also be able to set up their own motion capture system by downloading the sources and building the system from scratch. There is no official documentation at the moment, but we’ll be happy to assist those adventure seekers who wish to take their chance at building Chordata with the materials at hand.

    Be aware that this is an alpha release: we’ll keep on testing and improving the system in the coming months (and we’ll obviously let you know about all of the improvements to come). One of the things that drives us in releasing the code is finally being able to expand our circle of collaborators, so feel free to write to us through the Hackaday discussion or with the form that you can find on our website.

    That’s all fine and dandy, but there will be more to come. We’re preparing Chordata’s documentation so that anyone can access the core functionalities without the need of any sort of expertise. We’re also preparing a Tindie store in which you’ll be able to purchase our pre-built kits: this will enable people without knowledge of electronics to build and use Chordata so that they can apply its functionalities in their personal projects.

    What this means is that we’re just beginning, as Chordata’s purpose is reaching both those who already work with electronics and the general public, so that the worlds of visual art, game design, animation, movement computing, gait analysis, and physical therapy, among others, can also benefit from the possibility of capturing motion with a more accessible and open system. We have no official release date, but we expect all of these additional releases to be done over the next six months.

    Link to the sources repositories:

    Or download the working files at:

    Don’t hesitate to write us with all of your doubts (or simply to express your appreciation, as that’s what drives us further). We’re eager to see your reaction!

  • New video, web and social

    Bruno Laurencich 08/14/2018 at 18:27 0 comments

    Today we've launched a new video explaining all the features of Chordata, the Open Source motion capture system that you can build yourself. Make sure to check it out in our YouTube page and give it a like.

    We’ve also launched our new website, which features links to all of our social networks. With all of these tools, you’ll now be able to follow the evolution of Chordata without missing a single step of the process.

    We hope you enjoy them!

    All this couldn’t have been possible without the work of our new team members, Juancho and Flavia, who are helping out with communications and social media.

    If you want to give a little help to the project, it’s as easy as giving it a like here on Hackaday, which will help us stand out in the Human-Computer Interface round of the current edition of the Hackaday Prize.

  • Hackathon postmortem

    Bruno Laurencich 07/29/2018 at 14:45 0 comments

    The motion capture hackathon took place last week. It was great to see the Chordata system used in so many different projects!

    Here's a quick recap of the work done:

    Arianna: a great prototype for an easy low cost DIY sensor attachment suit.

    Antonio: networking improvements. SBC as an access point.

    Emanuele: 3D modelling in Blender with mocap

    Lorenzo and Matteo: Mocap suit as musical controller in Max/MSP

    Kine: thanks for soldering all those sensors!

    Massimo and Sara: processing SDK foundations. This gives the user a simplified interface to work with the mocap information.

    Mauro and Alexia: Mocap suit as a musical controller in SuperCollider

    Stefano: Unity 3D integration. Making visuals to go with the music.

    Andrea: Some mysterious project (as usual)

    We gathered lots of information about bugs, and possible interface improvements!

    Not to forget, of course... The winner of the hackathon by unanimous decision: Valerio and his dinosaur capturing tail prosthesis!!

    Many thanks to all the participants. We had a great time!

  • Motion capture Hackathon

    Bruno Laurencich 07/10/2018 at 15:03 0 comments

    On July 21st the first Chordata motion capture hackathon will take place!

    It will be an occasion for users to test the system before the release, and for us to catch some bugs ;).

    The idea is to bring together performative artists and digital creators to explore the possibilities of mocap as an artistic discipline.

    It will take place in Rome, at Workco coworking. If you live somewhere else, a stream of capture data will be available to let you work from home!

    If you want to participate, fill in the form here, or just leave a comment below.

  • The first release is coming!

    Bruno Laurencich 05/25/2018 at 17:32 0 comments

    In the last months we were working hard to achieve the capture stability required in any real-world scenario, and to be honest the results are not bad at all. So it’s time now to stop with this “improving stability” direction and focus on making the minimal interface changes needed to release it.
    We kept it to ourselves all this time because we felt there was no sense in publishing a shaky, or hardly usable, system. But at this point it would be great to have hands other than ours using and testing it, finding bugs or imagining a better user experience.

    So, we set August 2018 as the first release date. Looking forward to it!


    In the meantime we’ve been showcasing in a couple of places:

    Mmkm STUDIO @ Open House Roma.

    The friends of mmkm studio invited us to present a little performance with dance, music, and live-generated visuals. It was the first time we used the Chordata suit for what it was originally conceived for… three years ago!

    But it was also the first time the suit was used on a real “stage” in front of other people, having to deliver correct results during a definite period of time, and it went pretty well. This was the kind of proof we needed to convince ourselves that the little bird is ready to leave the nest.

    Hope you like it:

    For this presentation, and most of the time, we are using a 3D model by AlexanderLee. Thanks Alexander for sharing it!

    Codemotion Rome:

    This is one of the most important programmers' conferences in Europe. It was great to be able to share the guts of the software implementation with so many people; we got tons of interesting opinions and advice.

    We would like to thank Ugo Laurenti and Sara Di Bartolomeo for being there helping out with the stand, and also Flavia Laurencich and Juancho Casañas for their support.

    Thanks a lot!

View all 14 project logs

  • 1
    State of the project

    We have released all the working files and sources. You can get them by visiting our online repository, or in this project page.

    Be aware that this is an alpha release: we’ll keep on testing and improving the system in the coming months (and we’ll obviously let you know about all of the improvements to come). One of the things that drives us in releasing the code is finally being able to expand our circle of collaborators, so feel free to write to us through the Hackaday discussion or with the form that you can find at our website.

    We’re preparing Chordata’s documentation so that anyone can access the core functionalities without the need of any sort of expertise. We’re also preparing a Tindie store in which you’ll be able to purchase our pre-built kits: this will enable people without knowledge of electronics to build and use Chordata so that they can apply its functionalities in their personal projects.

  • 2
    Get the hardware

    a. (The easy way, coming soon..):

    Buy pre-assembled kits at our Tindie store. Each one comes with the board and some pre-soldered components, and includes a few additional components to be soldered by you.

    b.  (The hard way):

    Download the sources, send them to your favorite PCB manufacturer. Buy the components and solder everything together.

  • 3
    Additional hardware:

    Apart from our hardware you will need a regular microcomputer like the Raspberry Pi. All the sensors and the microcomputer are fixed to the performer’s body, and connected together with telephone-like wires (6-core, ~26 AWG, with crimped RJ-12 connectors).

View all 7 instructions




Ember Leona wrote 2 days ago

Hey Sounds great I really like the sound controller idea. Connect with Yoshimi[zynaddsubfx] and perhaps make a kukurukuri like game.  I though I would share this but you need to hack the wiimote but this was a great idea since 2007 Im suprised games havent used it. I wish I could code better and use DEVKITPRO.


Ember Leona wrote 10/08/2018 at 05:46

I think the quality of the video makes it harder to motion capture maybe downgrading video or low ISO or black and white with lights can help motion capture at least in a 2d projected sense. Have you seen that wiimote hack by john on youtube. He flipped the wiimote IR camera around and put IR leds in the eyeglasses I want to use this method but without the wiimote. With this head tracking you can animate the 4d space inside of a TV like when you move you head and look out a small with or make a circle or square with your hands and look through it while moving your head its somekind of parallax effect.


Ember Leona wrote 10/08/2018 at 05:47

 if you want to reduce latency


Bruno Laurencich wrote 5 days ago

I'm not sure I completely understand what you are talking about. This project relies on inertial and magnetic sensors to achieve the capture and has no visual input. In one of our videos (the one with some outdoor takes) there is in fact some annoying latency; those takes were made in a rush using a crappy laptop as a client, and the recorded captures ended up slightly out of sync. We are making some modifications in our client software to prevent these syncing issues in the recording, but the core system has no noticeable delay when capturing live, as you can see in this video:


Ember Leona wrote 10/08/2018 at 05:42

How much would this cost me? Also I had an idea for hand HumanInterfaceDevice I called binaryFingers I wanted to try mocap with reflected lights or maybe IR leds that can stick to face. That idea is more for facial animations. might have it or or 2


Bruno Laurencich wrote 5 days ago

We are preparing our Tindie store, where you will be able to purchase pre-assembled kits. We expect to have it running by the end of 2018. In the meantime you have all the information necessary to build your own suit on this project page, our website, or our GitLab repositories. It's not an expensive suit at all.


nenadvkc wrote 08/18/2018 at 22:38

This is amazing, are you considering adding position capture to your system?


Bruno Laurencich wrote 08/19/2018 at 08:36

When working with inertial-based capture, position tracking is kind of an indirect feature: it depends on having good inner-pose tracking. At the moment we are working on delivering a solid inner pose and a smooth user experience. We expect to achieve these goals with our first release. After that, position tracking is one of the main objectives.


Ember Leona wrote 2 days ago

So these are gyroscopes? Like Accelerometers in phone?


JAMES MARK WILSON wrote 07/18/2018 at 05:34

great stuff


Bruno Laurencich wrote 07/20/2018 at 10:11

thanks :)


Sophi Kravitz wrote 07/16/2018 at 18:14

Pretty exciting project... def interested!


Bruno Laurencich wrote 07/20/2018 at 10:11

Honored by your interest Sophi, I saw you doing pretty cool stuff around here!


Patrick Lowry wrote 07/16/2018 at 15:30

Awesome, guys.  This is something I've been thinking about for a while now, so very please to see someone actually doing it.   I'm keen on mixing the mocap with VR.  I look forward to you release the instructions in August.  Best of luck.


Bruno Laurencich wrote 07/16/2018 at 17:10

Thanks! looking forward to see what you can do with it!


Francois Medina wrote 05/02/2018 at 15:10

Amazing project! I am eager to build mine alongside the instructions when they are released. I am an absolute zero at these electronic things but I would love to build a mocap suit for 3D animation. As a 3D artist, if I can help you in some way, please reach out to me.


Bruno Laurencich wrote 05/25/2018 at 17:20

Hi Francois, I have good news for you: the first release is coming! The idea is to create a system that can be assembled with no previous knowledge, only the will to do it ;)

