Chordata is a motion capture system that you can build yourself. Our goal is to bring the world of motion capture to the universe of DIY technologies with an Open Hardware approach.

For a quick introduction you can visit our website.

Detailed technical information, building instructions and user manual are available at our wiki.

If you have any doubts, or just want to share your thoughts on the project, join the discussion at our forum.

Or perhaps you prefer to dive directly into the CODE or KICAD sources in our repositories on GitLab.


🚀 Kickstarter Campaign [120% funded! ⚡️🔥]

The Chordata Motion Kickstarter campaign has raised over €42.000, that's 120% of our initial funding goal. We’re blown away by the incredible response we received.
Thank you all for the support you've given this system!


Check out the campaign here: http://chordata.com/kickstarter

The project at a glance:

You can get an idea of what Chordata is all about with the infographic below.

Why Chordata was created

The origin of Chordata was a basic need. Bruno, our tech lead, wanted a way to register dance moves for a performance piece, but none of the available tools matched his needs (or his budget). A lot has happened since then: the system is now publicly available (as a BETA release), and lots of documentation can be found on the sites listed above.

Just for the record, we leave the original description of the project below, as it was written when the main parts of the system were still under development.


Original description of the project:

This project consists of three parts:

Hardware (K-Ceptor):

Motion capture is about getting the orientation of every body limb or part in real time, as accurately as possible. A simple MEMS IMU device* and freely available sensor fusion algorithms are enough to get a decent result. The problem starts when you want to get the data from several devices. Most of these devices come with an I2C interface, but their address is fixed in hardware. So one of the building blocks of Chordata is a sensing unit capable of coexisting with several “siblings” on the same bus: the “K-Ceptor”. It consists of an LSM9DS1 IMU and an LTC4316 I2C address translator.
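
To give a concrete picture of how the address translation lets several K-Ceptors share one bus, here is a small Python sketch that probes a few translated LSM9DS1 addresses from the hub. It uses the smbus2 library; the XOR masks and the bus number are illustrative assumptions, not the values used on the actual boards.

```python
# Sketch: probe several LSM9DS1 units sharing one I2C bus thanks to the
# LTC4316 address translator on each K-Ceptor. The translator XORs the
# address seen by the master, so each board answers at a different address.
from smbus2 import SMBus  # pip install smbus2

LSM9DS1_AG_ADDR = 0x6B    # accel/gyro address fixed in the sensor
WHO_AM_I_REG = 0x0F       # identity register
WHO_AM_I_VAL = 0x68       # expected value for the LSM9DS1 accel/gyro core

# Hypothetical XOR masks, one per board; the real masks depend on how the
# LTC4316 translation pins are strapped on each K-Ceptor.
TRANSLATION_MASKS = [0x00, 0x02, 0x04]

def probe_kceptors(bus_number=1):
    """Return the translated addresses that answer as an LSM9DS1."""
    found = []
    with SMBus(bus_number) as bus:
        for mask in TRANSLATION_MASKS:
            addr = LSM9DS1_AG_ADDR ^ mask
            try:
                if bus.read_byte_data(addr, WHO_AM_I_REG) == WHO_AM_I_VAL:
                    found.append(addr)
            except OSError:
                pass  # nothing answered at this address
    return found

if __name__ == "__main__":
    print("K-Ceptors found at:", [hex(a) for a in probe_kceptors()])
```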

While developing and prototyping we hand-soldered lots of these boards in house, so keeping the number of SMD components per board to a minimum made that process a little easier.

Software (Notochord):

Getting the data from a lot of sensors in real time, processing it, and sending it in an easy-to-read format to some client is not a simple job, so I’m developing software from scratch to deal with it.
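
At its core that job is a read → fuse → send loop. The Notochord itself is written in C++, but the Python sketch below shows the shape of that loop just to illustrate the data flow; `sensors`, `fuse` and `send` are placeholders, not the actual Notochord API.

```python
# Illustrative only: the real Notochord is a C++ program.
import time

def capture_loop(sensors, fuse, send, rate_hz=100):
    """Read every sensor, run the fusion step, and ship the result to a client."""
    period = 1.0 / rate_hz
    while True:
        t0 = time.monotonic()
        for sensor in sensors:
            raw = sensor.read()          # one gyro + accel + mag sample
            quat = fuse(sensor.id, raw)  # orientation as a quaternion (w, x, y, z)
            send(sensor.id, quat)        # e.g. one OSC message per K-Ceptor
        # keep a roughly constant output rate
        time.sleep(max(0.0, period - (time.monotonic() - t0)))
```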

The Notochord is responsible for:

After several tests I found that a Single Board Computer running Linux was the best choice to host such a program, so all of the development of this part of the software has been done in C++, using a Raspberry Pi 3 as the hub. Some of the advantages of this type of hub, in comparison with simpler microcontrollers, are:

The choice of performing the sensor fusion inside the hub is based on:
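
As a rough illustration of what that fusion step involves, the sketch below shows only the gyroscope-integration part of an orientation filter. Real filters (Madgwick, Mahony and friends) also blend in the accelerometer and magnetometer to correct drift; the function names here are just for illustration.

```python
# Minimal fusion sketch: advance a quaternion by one gyroscope sample.
import math

def quat_multiply(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(q, gyro_rad_s, dt):
    """Rotate orientation q by one body-frame gyro sample (rad/s) over dt seconds."""
    wx, wy, wz = gyro_rad_s
    dq = quat_multiply(q, (0.0, wx, wy, wz))            # q_dot = 0.5 * q * omega
    q = tuple(qi + 0.5 * dqi * dt for qi, dqi in zip(q, dq))
    norm = math.sqrt(sum(c * c for c in q))
    return tuple(c / norm for c in q)                   # renormalize
```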

Software (Client):

Since the protocol used to transmit the data is clear, the client can be anything capable of displaying a 3D skeleton.

Most of the time I'm using a Python script running in Blender that grabs the quaternion data from OSC and rotates the bones of a 3D armature.
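
For illustration, here is a minimal sketch of that kind of script, assuming the hub sends each bone's orientation as an OSC message like `/chordata/q/<bone_name>` with four floats (w, x, y, z). The address pattern, port and armature name are placeholders rather than the actual Chordata protocol, and it uses the python-osc package together with Blender's bpy API.

```python
# Run from Blender's text editor / Python console.
import threading
import bpy
from pythonosc import dispatcher, osc_server  # pip install python-osc

ARMATURE = "Armature"  # hypothetical armature object name

def apply_quaternion(address, w, x, y, z):
    """OSC handler: rotate the pose bone named at the end of the address."""
    bone_name = address.rsplit("/", 1)[-1]
    bone = bpy.data.objects[ARMATURE].pose.bones.get(bone_name)
    if bone is not None:
        bone.rotation_mode = 'QUATERNION'
        bone.rotation_quaternion = (w, x, y, z)

disp = dispatcher.Dispatcher()
disp.map("/chordata/q/*", apply_quaternion)

# A real add-on would drive this from a modal operator or an application
# timer instead of a bare thread, since Blender data is not meant to be
# touched from arbitrary threads; this is just enough to see the data flow.
server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 6565), disp)
threading.Thread(target=server.serve_forever, daemon=True).start()
```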

The idea is to release a basic client in the form of a Blender add-on responsible for:

*For the sake of simplicity I refer here to an IMU device, but to be precise I should say IMU (gyroscope and accelerometer) + magnetometer.