
Motion Capture system that you can build yourself

An open hardware-software framework based on inertial sensors that anyone can use to build a human motion capture system.

A couple of years ago I wanted to make a digital performance with a dancer on stage, and wanted to use a mocap suit to get his movements. There was none available at an affordable price, so I started the development of this one.

In the meantime, several cheaper options came out, but those remain out of reach for most users and, more importantly, they work under proprietary licenses. You can’t modify the way they work, or use parts of them in another software project.

As an alternative, this is a motion capture system that can be easily assembled by anyone in order for them to start capturing as soon as they are able to build it. Additionally, it is an open hardware-software framework that can be freely tweaked, enhanced, or used as part of another project.

Chordata is a motion capture system that you can build yourself. Our goal is to take the world of motion capture into the universe of DIY technologies with an Open Hardware approach.

For a quick introduction you can watch the video below, or visit our website.

Detailed technical information, building instructions and user manual are available at our wiki.

Or perhaps you prefer to dive directly into the code and KiCad sources in our repositories on GitLab.

The project at a glance:

You can get an idea of what Chordata is all about with the infographic below.

Why Chordata was created

The origin of Chordata was a basic need. Bruno, our tech lead, wanted a way to register dance moves for a performance piece, but none of the tools available matched his needs (nor his budget). A lot has happened since then: now the system is publicly available (as an ALPHA release for now), and lots of documentation can be found on the sites described above.

Just for the record, we leave the original description of the project below, as it was written when the main parts of the system were under development.


Original description of the project:

This project consists of three parts:

Hardware (K-Ceptor):

Motion capture is about getting the orientation of every body limb or part in real time, as accurately as possible. A simple MEMS IMU device* and freely available sensor fusion algorithms are enough to get a decent result. The problem starts when you want to get the data from several devices. Most of these devices come with an i2c interface, but their address is fixed in the hardware. So one of the building blocks of Chordata is a sensing unit capable of coexisting with several “siblings” on the same bus: the “K-Ceptor”. It consists of an LSM9DS1 IMU and an LTC4316 i2c address translator.
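The address-translation trick can be pictured in a few lines: the LTC4316 effectively XORs the sensor's fixed i2c address with a value set in hardware, so each K-Ceptor shows up at a different address on the shared bus. A minimal sketch of that idea (the 0x6B address and the mask values are illustrative, not the project's actual configuration):

```python
# Sketch of XOR-based i2c address translation (the mechanism behind the
# LTC4316). Mask values here are examples, not Chordata's actual setup.
LSM9DS1_ADDR = 0x6B  # the IMU's fixed 7-bit i2c address

def translated_address(addr: int, xor_mask: int) -> int:
    """7-bit address seen on the upstream bus after translation."""
    return (addr ^ xor_mask) & 0x7F

# Four K-Ceptors with different masks coexist on one bus:
masks = [0x00, 0x01, 0x02, 0x03]
addresses = [translated_address(LSM9DS1_ADDR, m) for m in masks]
print([hex(a) for a in addresses])  # four distinct bus addresses
```

Since each translator has a different hardware-set mask, the hub can address every sensor individually even though the IMUs themselves are identical.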

While developing and prototyping we hand-soldered lots of these boards in house, so having the minimum possible number of SMD components per board made that process a little easier.

Software (Notochord):

Getting the data from a lot of sensors in real time, processing it, and sending it in an easy-to-read format to some client is not a simple job, so I’m developing software from scratch to deal with it.

It is responsible for:

  • Building a digital model of the physical hierarchy of sensors, initializing the i2c communication on the Hub, and running the configuration routine on each of the sensors.
  • Performing a reading on each of the sensors at the specified refresh rate.
  • Correcting each sensor reading with the deviation obtained in a previous calibration process.
  • Performing sensor fusion on the corrected sensor reading, obtaining absolute orientation information in the form of a quaternion.
  • Sending the orientation data, together with the sensor_id and a timestamp, to the client using an open protocol (such as OSC).
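The last step is easy to picture: an OSC message is just an address pattern, a type-tag string, and big-endian arguments, each padded to a 4-byte boundary. A minimal hand-rolled sketch of packing one sensor reading (the `/Chordata/q` address pattern is an assumption for illustration, not necessarily the pattern Notochord actually uses):

```python
import struct
import time

def _pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def pack_osc(address: str, sensor_id: int, timestamp: float, quat) -> bytes:
    """Pack sensor_id, a timestamp and a (w, x, y, z) quaternion as one OSC message."""
    msg = _pad(address.encode()) + _pad(b",ifffff")  # 1 int32 + 5 float32 args
    return msg + struct.pack(">ifffff", sensor_id, timestamp, *quat)

packet = pack_osc("/Chordata/q", 3, time.time(), (1.0, 0.0, 0.0, 0.0))
# `packet` can now be sent over UDP to any OSC-capable client.
```

In practice you would use an OSC library rather than packing bytes by hand, but the sketch shows why OSC is attractive here: the payload per sensor is tiny and trivially parseable by clients such as Blender, Max/MSP or SuperCollider.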

After several tests I discovered that a Single Board Computer running Linux was the best choice to host such a program, so all of the development of this part of the software has been done in C++, using a Raspberry Pi 3 as the hub. Some of the advantages of this type of hub, in comparison with simpler microcontrollers, are:

  • It’s not an expensive component.
  • Programming and debugging is enormously simplified.
  • Some of them, like the RPi3, come out of the box with all the communication peripherals needed to perform a comfortable capture, the Wi-Fi adapter being a remarkable example.

The choice of performing the sensor fusion inside the hub is...


Chordata_blender_client_v0_1_0a.zip

A Blender add-on that allows you to receive, record, and retransmit motion capture data coming from a Chordata open source motion capture system.

Zip Archive - 12.59 MB - 10/27/2018 at 10:59


2018_10_21-Chordata Technical documentation.pdf

A detailed description of how the Chordata system is implemented. It contains: parts specifications, functional diagrams, schematics, protocol descriptions, power considerations, and more. All this content can also be found on the "How it works" section of the Chordata wiki: http://wiki.chordata.cc/wiki/How_it_works

Adobe Portable Document Format - 1.64 MB - 10/21/2018 at 10:24


Chordata_BOM_2018_10_20.pdf

Bill of materials for the main Chordata hardware: K-Ceptor, Hub, ID Module.

Adobe Portable Document Format - 330.14 kB - 10/20/2018 at 16:04


kc_r2.pdf

Schematics for the K-Ceptor R2

Adobe Portable Document Format - 80.61 kB - 10/18/2018 at 09:40


005_hub_r1.pdf

Schematics for the Hub R1

Adobe Portable Document Format - 106.16 kB - 10/18/2018 at 09:41



  • 1 × Raspberry Pi 3 or other SBC
  • 15 × LSM9DS1 IMU (number of units for an average full body suit)
  • 15 × LTC4316 i2c address translator (number of units for an average full body suit)
  • 36 × RJ12 6p6c connector (number of units for an average full body suit)
  • 15 × 240C2C EEPROM memory (number of units for an average full body suit)


  • The documentation is available!

    Bruno Laurencich • 10/21/2018 at 10:53 • 0 comments

    Dear hackers, we’re happy to inform you that after several weeks of working on it, Chordata has its documentation available! Those of you who were eager to build your own motion capture system now have dedicated learning material.

    We chose to implement it as a wiki, since that type of content structure perfectly suits the Chordata philosophy: sharing and constructing knowledge together. So we’ll be glad to hear your suggestions, or receive your contributions, to make a bigger and more useful knowledge base.

    You can access the documentation at http://wiki.chordata.cc

    A summary of the User Manual can be found in the instructions section of this project's page. We have also uploaded a PDF here with a detailed description of how the Chordata system is implemented. It contains: parts specifications, functional diagrams, schematics, protocol descriptions, power considerations, and more. (We will be uploading that content to the wiki in the following days.)

    If you have no idea what the Chordata system is about, you can just take a look at our new basic infographics.

    Let us know what you think!

  • Upcoming documentation

    Bruno Laurencich • 09/28/2018 at 16:58 • 0 comments

    These days we are working hard to prepare the documentation. We are expecting to publish a big part of it by the end of October, so stay tuned for detailed explanations on how the system works and how it can be built!

    In the meantime, we wanted to share with you an image of the cutting-edge technology we use for note-taking :P


  • Mocap as a sound controller

    Bruno Laurencich • 09/17/2018 at 09:00 • 0 comments

    Chordata was born several years ago with the idea of using it to create art from human motion. It took a long time of technical development, but now that the system is up and running we're having a lot of fun collaborating with several artists to test the possibilities of the system in different disciplines.

    For example, last weekend we were with our friends from Creative Coding Roma, using the suit as a sound controller for livecoding with SuperCollider. Here's a little teaser of what we made:

    Having the possibility of testing it in real use cases and getting feedback from real artists is invaluable to keep improving the usability and stability of the system!

    A very special thanks to Sergio and Valentina from Orange8, who provided us a really cool location in an old church for experimenting in Gaeta, Italy.

  • The testbed!

    Bruno Laurencich • 09/07/2018 at 08:47 • 0 comments

    One particularity of the development of this project is the number of units needed for every prototype. Every time we want to test something new we have to build around 15 sensing units (K-Ceptors). Anyone who has tried hand-soldering SMD components knows how easy it is to get it wrong. So we are proud to introduce our new assistant in the lab: the testbed!

    It allows us to troubleshoot the boards as soon as we take them out of the oven, saving us incredible amounts of time (and tears).

    We would really like to thank SparkFun Electronics for the inspiration they gave us by publishing their production process, and showing (among many other things) how they test their widgets with a pogobed.

  • New model topology and rigging

    Bruno Laurencich • 08/24/2018 at 14:02 • 0 comments

    Since we’ve focused our communication efforts on an electronics-oriented website, we’ve been omitting an important part of our project: the 3D model and rigging we use to visualize the captures.

    At the beginning we used a slightly modified version of a female model that we downloaded from blendswap.

    The model was originally created by AlexanderLee and we really liked its shape, but it was not optimized for real-time visualization, and the rigging didn’t match our requirements. This meant that Bruno (Chordata’s founder and a 3D animator himself) had to apply some changes on the fly and in a rush. Then, as usually happens, we kept recycling the model that resulted from those quick changes.

    That remained a limitation until one day François showed up, offering his 3D modeling and rigging skills. He did a great job adapting the model to be used in real-time motion capture.

    The changes he made are subtle, and almost invisible to those who are not specialized in 3D modeling:

    -Retopology: this is the process of rearranging the flow of polygons on a mesh so that it responds better to certain deformations. Check out the comparison images below: do you notice how smooth the deformation in the front area of the shoulder is with the new topology?

    Old topology

    New topology

    -New skinning and weight paint: the “weights” are just values stored in each vertex that determine the amount of deformation each bone produces. To correctly set these values, a process similar to spray painting is used. It’s a long process that requires continuous adjustments in several areas, over and over again…
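As a toy illustration of what those weight values mean: each vertex stores one weight per bone, and a normalize pass keeps them summing to 1 so the bone deformations blend cleanly. The bone names below are made up for the example:

```python
# Toy sketch of vertex weight normalization, the kind of cleanup a
# weight-paint tool performs. Bone names are illustrative only.
def normalize_weights(weights: dict) -> dict:
    """Scale the per-bone weights of one vertex so they sum to 1.0."""
    total = sum(weights.values())
    if total == 0:
        return dict(weights)  # unweighted vertex: nothing to normalize
    return {bone: w / total for bone, w in weights.items()}

vertex = {"upper_arm": 0.6, "shoulder": 0.4, "chest": 0.2}
print(normalize_weights(vertex))  # the three weights now sum to 1.0
```

Weight painting is essentially setting thousands of these small dictionaries by hand, which is why it takes so many passes to get right.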

    Thanks, François, for this great contribution! Even if the captures can be applied to any arbitrary model and rigging, having a good default one for visualization is a great improvement that will allow us to keep refining Chordata and test it in better conditions.

  • New network infrastructure​

    Bruno Laurencich • 08/24/2018 at 13:53 • 0 comments

    From the work done during our motion capture hackathon last month, our new collaborator Antonio Ianiero (iant) made some interesting modifications to the networking infrastructure.

    Before this, an external router (or a smartphone acting as a hotspot) had to be used to create a small ad-hoc network, to which the microcomputer and the client PC had to connect.

    Antonio saw this solution as impractical and inefficient. Instead, he configured the microcomputer to act as an access point: on power-up it exposes an SSID to which any Wi-Fi capable device can connect.

    In this way, not only do we eliminate an unnecessary intermediary in the network, but the portability of the system is also considerably improved. For example, to capture in an outdoor environment you only need to carry a laptop with charged batteries.
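On a Raspbian-based board this kind of access-point setup is typically done with `hostapd`; a minimal sketch of the idea is below. The SSID, passphrase, and interface name are placeholders for illustration, not the values the Chordata image actually uses.

```
# /etc/hostapd/hostapd.conf -- minimal access-point sketch (values are examples)
interface=wlan0
driver=nl80211
ssid=chordata-net
hw_mode=g
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=change-me-please
```

A DHCP server (such as `dnsmasq`) usually runs alongside it so that connecting clients get an IP address automatically.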

    Thanks to Antonio for such a huge boost!

  • Release Note [v0.1.0 ALPHA]

    Bruno Laurencich • 08/22/2018 at 19:30 • 0 comments

    Dear Chordata followers and collaborators, we have some awesome news to share with all of you: the day has come when we finally release the core of Chordata’s motion capture system. In this initial release, we’ll make our code repositories public so that anyone with a little bit of electronics and/or coding experience can go ahead and dive into the intricacies of our project. Those brave enough will also be able to set up their own motion capture system by downloading the sources and building the system from scratch. At the moment there’s no official documentation, but we’ll be happy to assist those adventure seekers who wish to take their chance at building Chordata with the materials at hand.

    Be aware that this is an alpha release: we’ll keep on testing and improving the system in the coming months (and we’ll obviously let you know about all of the improvements to come). One of the things that drives us in publishing the code is finally being able to expand our scope of collaborators, so feel free to write to us through the Hackaday discussion or with the form that you can find on our website.

    That’s all fine and dandy, but there will be more to come. We’re preparing Chordata’s documentation so that anyone can access the core functionalities without the need for any sort of expertise. We’re also preparing a Tindie store where you’ll be able to purchase our pre-built kits: this will enable people without knowledge of electronics to build and use Chordata, so that they can apply its functionalities in their personal projects.

    What this means is that we’re just beginning, as Chordata’s purpose is reaching both those who already work with electronics and the general public, so that the worlds of visual art, game design, animation, movement computing, gait analysis, and physical therapy, among others, can also benefit from the possibility of capturing motion with a more accessible and open system. We have no official release date, but we expect all of these additional releases to be done during the next semester.

    Link to the sources repositories: https://gitlab.com/chordata

    Or download the working files at: https://hackaday.io/project/27519-motion-capture-system-that-you-can-build-yourself#menu-files

    Don’t hesitate to write to us with all of your doubts (or simply to express your appreciation, as that’s what drives us further). We’re eager to see your reaction!

  • New video, web and social

    Bruno Laurencich • 08/14/2018 at 18:27 • 0 comments

    Today we've launched a new video explaining all the features of Chordata, the Open Source motion capture system that you can build yourself. Make sure to check it out on our YouTube page and give it a like.

    We’ve also launched our new website, which features links to all of our social networks. With all of these tools, you’ll now be able to follow the evolution of Chordata without missing a single step of the process.

    We hope you enjoy them!

    All this couldn’t have been possible without the work of our new team members, Juancho and Flavia, who are helping out with communications and social media.

    If you want to give the project a little help, it’s as easy as giving it a like here on Hackaday, which will help us stand out in the Human-Computer Interface round of the current edition of the Hackaday Prize.

  • Hackathon postmortem

    Bruno Laurencich • 07/29/2018 at 14:45 • 0 comments

    The motion capture hackathon took place last week. It was great to see the Chordata system used in so many different projects!

    Here's a quick recap of the work done:


    Arianna: a great prototype for an easy low cost DIY sensor attachment suit.

    Antonio: networking improvements. SBC as an access point.

    Emanuele: 3D modelling in Blender with mocap

    Lorenzo and Matteo: Mocap suit as musical controller in Max/MSP

    Kine: thanks for soldering all those sensors!

    Massimo and Sara: Processing SDK foundations. This gives the user a simplified interface to work with the mocap information.

    Mauro and Alexia: Mocap suit as a musical controller in SuperCollider

    Stefano: Unity 3D integration. Making visuals to go with the music.

    Andrea: Some mysterious project (as usual)

    We gathered lots of information about bugs, and possible interface improvements!

    Not to forget, of course... The winner of the hackathon by unanimous decision: Valerio and his dinosaur capturing tail prosthesis!!


    Many thanks to all the participants. We had a great time!

  • Motion capture Hackathon

    Bruno Laurencich • 07/10/2018 at 15:03 • 0 comments

    On July 21st the first Chordata motion capture hackathon will take place!

    It will be an occasion for users to test the system before the release, and for us to catch some bugs ;).

    The idea is to bring together performative artists and digital creators to explore the possibilities of mocap as an artistic discipline.

    It will take place in Rome, at the Workco coworking space. If you live somewhere else, a stream of capture data will be available to let you work from home!

    If you want to participate, fill in the form here, or just leave a comment below.


  • 1
    State of the project

    We have released all the working files and sources. You can get them by visiting our online repository, or from this project page.

    Be aware that this is an alpha release: we’ll keep on testing and improving the system in the coming months (and we’ll obviously let you know about all of the improvements to come). One of the things that drives us in publishing the code is finally being able to expand our scope of collaborators, so feel free to write to us through the Hackaday discussion or with the form that you can find on our website.

    On this project page you will find a quick description of the main steps needed to build and use your own motion capture gear; if you want to get the full picture, please visit our online documentation.

    We’re also preparing a Tindie store in which you’ll be able to purchase our pre-built kits: this will enable people without knowledge of electronics to build and use Chordata so that they can apply its functionalities in their personal projects.

  • 2
    Get the Chordata hardware
    Chordata parts
    Chordata specific hardware

    The Chordata-specific hardware is composed of three different parts.

    Theoretically they can be arranged to form arbitrary hierarchies, but if you want to capture human movements, you will be using the default node configuration most of the time. The default biped configuration uses 15 K-Ceptors, 15 ID Modules and 1 Hub. You can get them in one of the following ways:
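As a rough picture of that default hierarchy, here is one way it could be sketched: one Hub at the root driving 15 K-Ceptors, one per tracked body part. The node names below are illustrative guesses, not Chordata's actual identifiers.

```python
# Hypothetical sketch of the default biped arrangement: 1 Hub, 15 K-Ceptors.
# Node names are illustrative, not the project's actual IDs.
biped_nodes = {
    "hub": [
        "head",
        "dorsal", "base",
        "l-upperarm", "l-forearm", "l-hand",
        "r-upperarm", "r-forearm", "r-hand",
        "l-thigh", "l-shin", "l-foot",
        "r-thigh", "r-shin", "r-foot",
    ]
}
print(len(biped_nodes["hub"]))  # prints 15: one sensing unit per body part
```

Counting three units per limb plus head and torso nodes is what brings the default suit to 15 K-Ceptors.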

    a. (The easy way, coming soon..):

    Buy pre-assembled kits at our Tindie store. Each one comes with the board and some pre-soldered components (all the difficult ones), and includes a few additional components to be soldered by you.

    b. (The hard way):

    Start by downloading the sources from our website or GitLab repository and send them to your favorite PCB manufacturer.

    Buy all the components; the BOM can be found in the downloads section of this project page.

    Solder everything together. The current version of the Chordata hardware uses as many THT components as possible, but there are some tiny SMD ones to be soldered as well.

    See the Chordata parts chapter on our documentation for more details.

  • 3
    Additional hardware: SBC
    Raspberry pi and Chordata Hub

    Apart from our hardware you will need a regular microcomputer like the Raspberry Pi (“microcomputer” is just a colloquial name; technically it’s called an SBC, or Single Board Computer).

    The software part of the Chordata system is composed of several programs, and most of them should run on the SBC. The easiest way to get them all, and configure them correctly, is to download the custom Linux (Raspbian-based) image that will soon be available on our webpage.

    Flashing the image is really simple using the dedicated tool Etcher, available for Windows, Mac and Linux. Start by downloading it from their webpage and installing it.

    • Insert the SD card into your computer’s card reader.
    • In Etcher, select the downloaded file, select your SD card as the destination disk, and hit Flash!

    See the Microcomputer chapter on our documentation for more details.



Discussions

Ember Leona wrote 10/08/2018 at 05:46

I think the quality of the video makes it harder to motion capture maybe downgrading video or low ISO or black and white with lights can help motion capture at least in a 2d projected sense. Have you seen that wiimote hack by john on youtube. He flipped the wiimote IR camera around and put IR leds in the eyeglasses I want to use this method but without the wiimote. With this head tracking you can animate the 4d space inside of a TV like when you move you head and look out a small with or make a circle or square with your hands and look through it while moving your head its somekind of parallax effect.


Ember Leona wrote 10/08/2018 at 05:47

 if you want to reduce latency


Bruno Laurencich wrote 10/12/2018 at 10:04

I'm not sure I completely understand what you are talking about. This project relies on inertial and magnetic sensors to achieve the capture, and has no visual input. In one of our videos (the one with some outdoor takes) there is in fact some annoying latency: they were made in a rush, using a crappy laptop as a client, and the recorded captures ended up slightly out of sync. We are making some modifications to our client software to prevent these syncing issues when recording, but the core system has no noticeable delay when capturing live, as you can see in this video: https://youtu.be/vp6J6rabenk


Ember Leona wrote 10/08/2018 at 05:42

How much would this cost me? Also I had an idea for hand HumanInterfaceDevice I called binaryFingers I wanted to try mocap with reflected lights or maybe IR leds that can stick to face. That idea is more for facial animations. openInvent.club might have it or tiny.cc/openInvent1 or 2


Bruno Laurencich wrote 10/12/2018 at 10:11

We are preparing our Tindie store, where you will be able to purchase pre-assembled kits. We expect to have it running by the end of 2018. In the meantime you have all the information necessary to build your own suit on this project page, our website, or our GitLab repositories. It's not an expensive suit at all.


nenadvkc wrote 08/18/2018 at 22:38

This is amazing, are you considering adding position capture to your system?


Bruno Laurencich wrote 08/19/2018 at 08:36

When working with inertial-based capture, position tracking is kind of an indirect feature: it depends on having good inner-pose tracking. At the moment we are working on delivering a solid inner pose and a smooth user experience. We expect to achieve these goals with our first release. After that, position tracking is one of the main objectives.
Thanks!!


Ember Leona wrote 10/15/2018 at 06:10

So these are gyroscopes? Like Accelerometers in phone?


JAMES MARK WILSON wrote 07/18/2018 at 05:34

great stuff


Bruno Laurencich wrote 07/20/2018 at 10:11

thanks :)


Sophi Kravitz wrote 07/16/2018 at 18:14

Pretty exciting project... def interested!


Bruno Laurencich wrote 07/20/2018 at 10:11

Honored by your interest Sophi, I saw you doing pretty cool stuff around here!


Patrick Lowry wrote 07/16/2018 at 15:30

Awesome, guys.  This is something I've been thinking about for a while now, so very please to see someone actually doing it.   I'm keen on mixing the mocap with VR.  I look forward to you release the instructions in August.  Best of luck.


Bruno Laurencich wrote 07/16/2018 at 17:10

Thanks! looking forward to see what you can do with it!


Francois Medina wrote 05/02/2018 at 15:10

Amazing project! I am eager to build mine alongside the instruction when it will be release.I am an absolute zero at these electronic things but I would love to build a mocap suit for 3D animation. As a 3D artist, if I can help you in some ways, please reach out to me.


Bruno Laurencich wrote 05/25/2018 at 17:20

Hi Francois, I have good news for you: the first release is coming! The idea is to create a system that can be assembled with no previous knowledge, only the will to do it ;)

