
Motion Capture system that you can build yourself

An open hardware-software framework based on inertial sensors that anyone can use to build a human motion capture system.

A couple of years ago I wanted to make a digital performance with a dancer on stage, and wanted to use a mocap suit to capture his movements. There was none available at an affordable price, so I started the development of this one.

In the meantime, several cheaper options came out, but those remain out of reach for most users and, more importantly, they work under proprietary licenses. You can’t modify the way they work, or use parts of them in another software project.

As an alternative, this is a motion capture system that can be easily assembled by anyone, so that they can start capturing as soon as they finish building it. Additionally, it is an open hardware-software framework that can be freely tweaked, enhanced, or used as part of another project.

Chordata is a motion capture system that you can build yourself. Our goal is to bring motion capture into the world of DIY technologies with an Open Hardware approach.

For a quick introduction you can watch the video below, or visit our website.

Detailed technical information, building instructions and user manual are available at our wiki.

Or perhaps you prefer to dive directly into the code and KICAD sources in our repositories on GitLab.

If you have any doubts, or just want to share your thoughts on the project, join the discussion at our forum.


The application process to be part of our 2nd Beta-testing program is now open!
If you are interested in testing a prototype version of the system, please fill in the form on this page.
We can only offer 20 places, so don’t miss it!

The project at a glance:

You can get an idea of what Chordata is all about with the infographic below.

Why Chordata was created

The origin of Chordata was a basic need. Bruno, our tech lead, wanted a way to register dance moves for a performance piece, but none of the tools available matched his needs (nor his budget). A lot has happened since then: now the system is publicly available (as a BETA release), and lots of documentation can be found on the sites described above.

Just for the record we leave the original description of the project below, as it was written when the main parts of the system were under development.


Original description of the project:

This project consists of three parts:

Hardware (K-Ceptor):

Motion capture is about getting the orientation of every body limb or part in real time, as accurately as possible. A simple MEMS IMU device and freely available sensor fusion algorithms are enough to get a decent result. The problem starts when you want to get the data of several devices. Most of these devices come with an i2c interface, but their address is fixed in the hardware. So one of the building blocks of Chordata is a sensing unit capable of coexisting with several “siblings” on the same bus: the “K-Ceptor”. It consists of an LSM9DS1 IMU and an LTC4316 i2c address translator.
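The address-translation trick can be sketched in a few lines: an LTC4316-style translator XORs part of the incoming 7-bit address with a hardwired value, so identical sensors can appear at distinct bus addresses. The masks below are illustrative only, not the actual values used on the K-Ceptor boards.

```python
# Sketch of i2c address translation (LTC4316-style): each translator XORs
# the incoming 7-bit address with a hardwired mask, so identical LSM9DS1
# sensors can coexist on one bus. Masks here are illustrative only.

LSM9DS1_AG_ADDR = 0x6B  # LSM9DS1 accel/gyro slave address (SDO pin high)

def translated_address(target_addr: int, xor_mask: int) -> int:
    """Address the host must use so the translator forwards target_addr."""
    return (target_addr ^ xor_mask) & 0x7F

# One hypothetical mask per K-Ceptor sharing the bus:
masks = [0x00, 0x01, 0x02, 0x03]
bus_addresses = [translated_address(LSM9DS1_AG_ADDR, m) for m in masks]

print([hex(a) for a in bus_addresses])  # ['0x6b', '0x6a', '0x69', '0x68']
assert len(set(bus_addresses)) == len(masks)  # every sensor is reachable
```

Because XOR is its own inverse, the translator can map the reply traffic back with the same mask, which is what makes this scheme workable in hardware.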

While developing and prototyping we hand-soldered lots of these boards in-house, so having the minimum possible number of SMD components per board made that process a little easier.

Software (Notochord):

Getting the data of a lot of sensors in real time, processing it, and sending it in an easy-to-read format to a client is not a simple job, so I’m developing software from scratch to deal with it.

It is responsible for:

  • Building a digital model of the physical hierarchy of sensors. Initializing the i2c communication on the Hub, and running the configuration routine on each of the sensors.
  • Performing a reading on each of the sensors at the specified refresh rate.
  • Correcting each sensor reading with the deviation obtained on a previous calibration process.
  • Performing sensor fusion on the corrected sensor reading, obtaining absolute orientation information in the form of a quaternion.
  • Sending the orientation data, together with the sensor_id and a timestamp to the client using...
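The read-fuse-send loop described above can be sketched roughly as follows: each iteration packs a (sensor_id, timestamp, quaternion) sample and ships it to the client. The binary layout, port number and function names below are illustrative assumptions, not Notochord's actual wire protocol.

```python
import socket
import struct
import time

def pack_sample(sensor_id: int, timestamp_us: int, quat) -> bytes:
    """Pack one sample: id, microsecond timestamp, quaternion (w, x, y, z)."""
    return struct.pack("<Bq4f", sensor_id, timestamp_us, *quat)

def unpack_sample(payload: bytes):
    sensor_id, ts, w, x, y, z = struct.unpack("<Bq4f", payload)
    return sensor_id, ts, (w, x, y, z)

# One fake reading standing in for "read sensor -> sensor fusion -> send":
sample = pack_sample(3, int(time.time() * 1e6), (1.0, 0.0, 0.0, 0.0))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(sample, ("127.0.0.1", 7000))  # client address is hypothetical

sensor_id, ts, quat = unpack_sample(sample)
assert sensor_id == 3 and quat == (1.0, 0.0, 0.0, 0.0)
```

A fixed-size little-endian packet like this keeps parsing trivial on the client side, which matters when a full suit produces over a thousand samples per second.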

Chordata_notochord_v0_1_1b.tar.gz

Hub program of the Chordata open source motion capture system. 0.1.1b Changelog: ========= - "Scan" feature - Fix bug with accelerometer and gyroscope calibration - Validate accel and gyro calibration - Improve calibration CLI UX - Add higher sensor data rate (experimental)

gzip - 844.67 kB - 05/21/2019 at 16:28


Chordata_BOM_R2-2_2019_02_4.pdf

Complete bill of materials

Adobe Portable Document Format - 223.90 kB - 02/12/2019 at 13:34


Chordata-K_ceptor-R2.2.zip

Sensing hardware of the Chordata open source motion capture system. KICAD project

Zip Archive - 100.77 kB - 02/12/2019 at 13:16


Chordata-Hub-R1.2.zip

Hub hardware of the Chordata open source motion capture system. KICAD project

Zip Archive - 142.84 kB - 02/12/2019 at 13:16


Chordata_blender_client_v0_1_0a.zip

Blender add-on that allows you to receive, record, and retransmit motion capture data coming from a Chordata open source motion capture system.

Zip Archive - 12.59 MB - 10/27/2018 at 10:59



  • 1 × Raspberry Pi 3 or other SBC
  • 15 × LSM9DS1 IMU (number of units for an average full-body suit)
  • 15 × LTC4316 i2c address translator (number of units for an average full-body suit)
  • 36 × RJ12 6P6C connector (number of units for an average full-body suit)
  • 15 × M24C01 EEPROM memory (number of units for an average full-body suit)


  • Capture from a smartphone

    Bruno Laurencich, 09/12/2019 at 08:30

    ⚡ In the past months we have been working on improving Chordata’s Remote Console, which lets you capture data in a much easier way, even straight from your smartphone!

    This improvement is something we’ve always desired, as Chordata is designed for both specialized users and beginners. With the Remote Console we’ll make the capture process simpler, thus allowing anyone to use our system easily!

    Of course this is not a replacement for the "advanced" workflow; power users will still be able to access the system using tools such as SSH.

    Here's a demo of its functionality :)

  • 2nd Beta testing program

    Bruno Laurencich, 09/03/2019 at 13:46

    We’re very excited to announce that we’re preparing a second Beta-testing program!

    You have asked for it so many times that we decided it’s needed in order to honor your support. If selected, you’ll have access to a prototype of Chordata’s system at cost price.

    This will be the second Beta-testing program we do before our official product launch. If everything goes as planned, Chordata suits will be available through our Tindie store in the first quarter of 2020!

    Because of our current small-scale production method, we can only offer the system to 20 beta-testers.

    Visit this page for the details: https://chordata.cc/beta-testing/

  • 0.1.1b Notochord software Release

    Bruno Laurencich, 05/21/2019 at 16:34

    During the last weeks we've been working together with a group of beta testers on a new software release: Chordata's Notochord v0.1.1b is out!

    Apart from many bug fixes, it can now read sensors at double the previous rate: up to 100Hz!
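A quick back-of-the-envelope shows why the rate matters: assuming the 15 sensors of a full suit are read sequentially, 100Hz leaves well under a millisecond per sensor read.

```python
# Rough per-read time budget for sequential reads: 15 sensors at 100 Hz.
sensors = 15
rate_hz = 100

reads_per_second = sensors * rate_hz           # 1500 reads/s
budget_us = 1_000_000 / reads_per_second       # microseconds per read

print(f"{reads_per_second} reads/s -> {budget_us:.0f} us per read")
assert reads_per_second == 1500
```

That budget has to cover the i2c transaction, calibration correction and sensor fusion for each sample, which is why squeezing out a higher data rate is non-trivial.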

    0.1.1b Changelog:
    =================
    
    - "Scan" feature
    - Fix bug with accelerometer and gyroscope calibration
    - Validate accel and gyro calibration
    - Improve calibration CLI UX
    - Add higher sensor data rate (experimental)

  • New hardware revision

    Bruno Laurencich, 02/12/2019 at 13:14

    A new revision of the hardware is available. It includes some fixes and minor changes:

    • New EEPROM on K-Ceptor (M24C01)
    • ID_module can be replaced by on-board resistors
    • Pull-up resistors on i2c channels on Hub
    • Fix Hub PCA9548 footprint
    • Fix Hub gates pin order

  • The documentation is available!

    Bruno Laurencich, 10/21/2018 at 10:53

    Dear hackers, we’re happy to inform you that after several weeks of working on it, Chordata’s documentation is available! Those of you who were eager to build your own motion capture system now have dedicated learning material.

    We chose to implement it as a wiki, since that type of content structure perfectly suits the Chordata philosophy: sharing and constructing knowledge together. So we’ll be glad to hear your suggestions, or receive your contributions to make a bigger and more useful knowledge base.

    You can access the documentation at http://wiki.chordata.cc

    A summary of the User Manual can be found in the instructions section of this project's page. We have also uploaded here a PDF with a detailed description of how the Chordata system is implemented; it contains parts specifications, functional diagrams, schematics, protocol descriptions, power considerations, and more. (We will be uploading that content to the wiki in the following days.)

    If you have no idea what the Chordata system is about, you can just take a look at our new basic infographics.

    Let us know what you think!

  • Upcoming documentation

    Bruno Laurencich, 09/28/2018 at 16:58

    These days we are working hard to prepare the documentation. We are expecting to publish a big part of it by the end of October, so stay tuned for detailed explanations on how the system works and how it can be built!

    In the meantime, we wanted to share with you an image of the cutting edge technology application we use for note-taking  :P


  • Mocap as a sound controller

    Bruno Laurencich, 09/17/2018 at 09:00

    Chordata was born several years ago with the idea of using it to create art from human motion. It took a long time of technical development, but now that the system is up and running we're having so much fun collaborating with several artists to test the possibilities of the system in different disciplines.

    For example, last weekend we were with our friends from Creative Coding Roma using the suit as a sound controller for livecoding with SuperCollider. Here's a little teaser of what we made:

    Having the possibility to test it in real use cases and get feedback from real artists is invaluable for improving the usability and stability of the system!

    A very special thanks to Sergio and Valentina from Orange8, who provided us with a really cool location in an old church for experimenting in Gaeta, Italy.

  • The testbed!

    Bruno Laurencich, 09/07/2018 at 08:47

    One particularity of the development of this project is the number of units needed for every prototype. Every time we want to test something new we have to build around 15 sensing units (K-Ceptors). Anyone who has tried hand-soldering SMD components knows how easy it is to get it wrong. So we are proud to introduce our new assistant in the lab: the testbed!

    It allows us to troubleshoot the boards as soon as we take them out of the oven, saving us incredible amounts of time (and tears).

    We would really like to thank SparkFun Electronics for the inspiration they gave us by publishing their production process, and showing (among many other things) how they test their widgets with a pogobed.

  • New model topology and rigging

    Bruno Laurencich, 08/24/2018 at 14:02

    Since we’ve focused our communication efforts on an electronics-oriented website, we’ve been omitting an important part of our project: the 3D model and rigging we use to visualize the captures.

    At the beginning we used a slightly modified version of a female model that we downloaded from blendswap.

    The model was originally created by AlexanderLee and we really liked its shape, but it was not optimized for real-time visualization, and the rigging didn’t match our requirements. This meant that Bruno (Chordata’s founder and a 3D animator himself) had to apply some changes on the fly and in a rush. Then, as usually happens, we kept recycling the model that resulted from those quick changes.

    That kept being a limitation until one day François showed up offering his 3D modeling and rigging skills. He did a great job adapting the model to be used in real-time motion capture.

    The changes he made are subtle, and almost invisible to those who are not specialized in 3D modeling:

    -Retopology: This is the process of rearranging the flow of polygons on a mesh so that it responds better to certain deformations. Check out the comparison images below: do you notice how smooth the deformation in the front area of the shoulder is with the new topology?

    Old topology

    New topology

    -New skinning and weight paint: The “weights” are just values inside each vertex that determine the amount of deformation each bone produces. To set these values correctly, a process similar to spray painting is used. It’s a long process that requires continuous adjustments on several areas over and over again…

    Thanks François for this great contribution! Even if the captures can be applied to any arbitrary model and rigging, having a good default one for visualization is a great improvement that will allow us to improve Chordata and test it in better conditions.

  • New network infrastructure

    Bruno Laurencich, 08/24/2018 at 13:53

    From the work done during our motion capture hackathon last month, our new collaborator Antonio Ianiero (iant) made some interesting modifications to the networking infrastructure.

    Before this, an external router (or a smartphone as a hotspot) had to be used to create a small ad-hoc network, to which the microcomputer and the client PC had to connect.

    Antonio saw this solution as impractical and inefficient. Instead, he configured the microcomputer to act as an access point: on power-up it exposes an SSID to which any WiFi-capable device can connect.

    In this way, not only do we eliminate an unnecessary intermediary in the network, but the portability of the system is also considerably improved. For example, to capture in an external environment you only need to carry a laptop with charged batteries.
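    For reference, turning an SBC's WiFi interface into an access point is typically done with hostapd; a minimal configuration sketch might look like the one below. The SSID, passphrase and interface name are placeholders, not the actual values used on Chordata's image.

    ```ini
    # /etc/hostapd/hostapd.conf (illustrative sketch, not Chordata's actual config)
    interface=wlan0
    driver=nl80211
    ssid=chordata-ap
    hw_mode=g
    channel=6
    wpa=2
    wpa_passphrase=changeme123
    wpa_key_mgmt=WPA-PSK
    rsn_pairwise=CCMP
    ```

    A DHCP server (e.g. dnsmasq) would normally run alongside this so that connecting clients get an address automatically.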

    Thanks to Antonio for such a huge boost!


  • 1. State of the project

    We have released all the working files and sources. You can get them by visiting our online repository, or in this project page.

    Be aware that this is a Beta release: we’ll keep on testing and improving the system in the next months (and we’ll obviously let you know about all the improvements to come). One of the things that drove us to publish the code is finally being able to expand our circle of collaborators, so feel free to write to us through the Hackaday discussion or via the form on our website.

    On this project page you will find the original (outdated) description of the main steps needed to build and use your own motion capture gear; if you want to get the full picture, please visit our online documentation.

    We’re also preparing a Tindie store where you’ll be able to purchase our pre-built kits: this will enable people without electronics knowledge to build and use Chordata, so that they can apply its functionality in their personal projects.

  • 2. Get the Chordata hardware
    Chordata parts
    Chordata specific hardware

    The Chordata-specific hardware is composed of three different parts.

    Theoretically speaking, they can be arranged to form arbitrary hierarchies, but if you want to capture human movements you will be using the default node configuration most of the time. The default biped configuration uses 15 K-Ceptors, 15 ID Modules and 1 Hub. You can get them in one of the following ways:

    a. (The easy way, coming soon…):

    Buy pre-assembled kits at our Tindie store. Each one comes with the board and some pre-soldered components (all the difficult ones), and includes a few additional components to be soldered by you.

    b. (The hard way):

    Start by downloading the sources at our website or gitlab repository and send them to your favorite PCB manufacturer.

    Buy all the components, the BOM can be found at the downloads section of this project page.

    Solder everything together. The current version of the Chordata hardware uses as many THT components as possible, but there are some tiny SMD ones to be soldered as well.

    See the Chordata parts chapter on our documentation for more details.
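As a mental model, the default biped arrangement is just a shallow tree of sensing units hanging off the Hub. The sketch below counts the nodes of a hypothetical hierarchy; the branch and segment names are made up for illustration and are not Chordata's actual configuration.

```python
# Hypothetical default biped hierarchy: 15 K-Ceptors hanging off 1 Hub.
# Branch and segment names are illustrative only.
biped = {
    "base": ["dorsal"],
    "left_arm": ["l-upperarm", "l-lowerarm", "l-hand"],
    "right_arm": ["r-upperarm", "r-lowerarm", "r-hand"],
    "left_leg": ["l-thigh", "l-shin", "l-foot"],
    "right_leg": ["r-thigh", "r-shin", "r-foot"],
    "head": ["neck", "head"],
}

k_ceptors = sum(len(segments) for segments in biped.values())
print(f"{k_ceptors} K-Ceptors on 1 Hub")
assert k_ceptors == 15  # matches the default biped configuration
```

Any other hierarchy would just be a different tree, which is why the system can in principle be arranged in arbitrary configurations.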

  • 3. Additional hardware: SBC
    Raspberry Pi and Chordata Hub

    Apart from our hardware you will need a regular microcomputer like the Raspberry Pi (“microcomputer” is just a colloquial name; technically it’s called an SBC, a single-board computer).

    The software part of the Chordata system is composed of several programs, and most of them should run on the SBC. The easiest way to get them all, and configure them correctly, is to download the custom Linux (Raspbian-based) image that will soon be available on our webpage.

    The process of flashing is really simple using the dedicated tool Etcher, available for Windows, Mac and Linux. Start by downloading it from their webpage and installing it.

    • Insert the SD card into your computer’s card reader.
    • In Etcher, select the downloaded file, select your SD card as the destination disk and hit Flash!

    See the Microcomputer chapter on our documentation for more details.



Discussions

Laurent wrote 04/06/2019 at 19:32 point

Hi, I'm working in an animation studio and would like to know if it's possible to test this great tool to help animators improve their hard work and bring a touch of reality into models' movements.

Thanks in advance :) Really good project, and as far as I know rPi, Arduino and sensors, I'm sure this kind of affordable tool will become a must-have in the near future :)


Bruno Laurencich wrote 05/29/2019 at 10:36 point

Hi Laurent.

The first round of beta testing is now over. There will be another one in the following months for sure. Be sure to subscribe to our newsletter at https://chordata.cc to get the latest updates.

Otherwise you can always build the hardware by yourself. You will find information on our wiki…


Schimpansendodo wrote 03/02/2019 at 14:23 point

Hey Bruno, awesome project :).

Is it possible to become a beta tester?

Greetings from Germany

Dominik


Bruno Laurencich wrote 03/06/2019 at 12:24 point

Hello Dominik!

We are currently evaluating the last few candidates of this beta testing batch. Please write to us with a brief description of your background and how you plan to use the system, and you might still have a chance.

Or, if you don't make it in this round, we will probably open a second one in the following months.

more info here: https://forum.chordata.cc/d/5-become-a-beta-tester


eduard.b wrote 11/24/2018 at 01:26 point

Hi Bruno, awesome what you are doing here. I want to ask if you have any costume for sale already…


Bruno Laurencich wrote 12/27/2018 at 17:14 point

Hi Eduard,

Currently we're preparing an initial prototype offer that we expect to release by January 2019 as part of our first Beta testing program. Keep in mind that only a limited number of suits will be produced during this period.

Once the Beta period is over, we will start offering the first production release, whose price and release date are still to be determined.

If you are interested in participating in the beta testing program, please get in touch with us.


Ember Leona wrote 10/08/2018 at 05:46 point

I think the quality of the video makes it harder to motion capture; maybe downgrading the video, or low ISO, or black and white with lights can help motion capture, at least in a 2D projected sense. Have you seen that wiimote hack by john on youtube? He flipped the wiimote IR camera around and put IR LEDs in the eyeglasses. I want to use this method but without the wiimote. With this head tracking you can animate the 4D space inside of a TV, like when you move your head and look out a small window, or make a circle or square with your hands and look through it while moving your head; it's some kind of parallax effect.


Ember Leona wrote 10/08/2018 at 05:47 point

 if you want to reduce latency


Bruno Laurencich wrote 10/12/2018 at 10:04 point

I'm not sure I completely understand what you are talking about. This project relies on inertial and magnetic sensors to achieve the capture and has no visual input. In one of our videos (the one with some outdoor takes) there is in fact some annoying latency: they were made in a rush using a crappy laptop as a client, and the registered captures came out slightly out of sync. We are making some modifications to our client software to prevent these syncing issues in the recording, but the core system has no noticeable delay when capturing live, as you can see in this video: https://youtu.be/vp6J6rabenk


Ember Leona wrote 10/08/2018 at 05:42 point

How much would this cost me? Also, I had an idea for a hand HumanInterfaceDevice I called binaryFingers. I wanted to try mocap with reflected lights, or maybe IR LEDs that can stick to the face. That idea is more for facial animations. openInvent.club might have it, or tiny.cc/openInvent1 or 2


Bruno Laurencich wrote 10/12/2018 at 10:11 point

We are preparing our Tindie store where you will be able to purchase pre-assembled kits. We expect to have it running by the end of 2018. In the meantime you have all the information necessary to build your own suit on this project page, our website, or our GitLab repositories. It's not an expensive suit at all.


nenadvkc wrote 08/18/2018 at 22:38 point

This is amazing, are you considering adding position capture to your system?


Bruno Laurencich wrote 08/19/2018 at 08:36 point

When working with inertial-based capture, position tracking is kind of an indirect feature: it depends on having good inner-pose tracking. At the moment we are working on delivering a solid inner pose and a smooth user experience. We expect to achieve these goals with our first release. After that, position tracking is one of the main objectives.
Thanks!!


Ember Leona wrote 10/15/2018 at 06:10 point

So these are gyroscopes? Like Accelerometers in phone?


JAMES MARK WILSON wrote 07/18/2018 at 05:34 point

great stuff


Bruno Laurencich wrote 07/20/2018 at 10:11 point

thanks :)


Sophi Kravitz wrote 07/16/2018 at 18:14 point

Pretty exciting project... def interested!


Bruno Laurencich wrote 07/20/2018 at 10:11 point

Honored by your interest Sophi, I saw you doing pretty cool stuff around here!


Patrick Lowry wrote 07/16/2018 at 15:30 point

Awesome, guys. This is something I've been thinking about for a while now, so I'm very pleased to see someone actually doing it. I'm keen on mixing the mocap with VR. I look forward to your releasing the instructions in August. Best of luck.


Bruno Laurencich wrote 07/16/2018 at 17:10 point

Thanks! Looking forward to seeing what you can do with it!


Francois Medina wrote 05/02/2018 at 15:10 point

Amazing project! I am eager to build mine following the instructions when they are released. I am an absolute zero at these electronic things, but I would love to build a mocap suit for 3D animation. As a 3D artist, if I can help you in some way, please reach out to me.


Bruno Laurencich wrote 05/25/2018 at 17:20 point

Hi Francois, I have good news for you: the first release is coming! The idea is to create a system that can be assembled with no previous knowledge, only the will to do it ;)

