User Manual/5. Capturing

From Chordata documentation

The time has finally come! Wasn't building your own mocap system lots of fun? Imagine how much fun it will be to use it! Let's get to it.

Capture using the Blender add-on

In the near future, once the Utility software is up and running, you will be able to start the capture directly from the add-on.

In the meantime you have to access the SBC's terminal environment, typically through SSH, and start the capture manually. See this page for information on how to do it.

You can follow the Roadmap to check the state of the development.
Default biped configuration

Make sure you placed the sensors according to the Default Biped Configuration. Then run the notochord with these commands:

cd bin
./notochord -c ../default_biped.xml <YOUR COMPUTER'S IP ADDRESS>

Startup and In-pose calibration

In order to receive the data you need an armature that matches the distribution of sensors on the performer. You can download a model for the Default Biped from our downloads page.

Load the file in Blender and make sure the "Armature Object" field in the Chordata add-on is set to the armature "Chordata".

The current in-pose calibration algorithm doesn't implement an overall heading offset. As a consequence, the performer has to stand facing a particular cardinal direction while calibrating (matching the direction the 3D model is pointing). This feature will be implemented in future versions; see the Roadmap to check the state of the development.

1. Ask your performer to look towards the South and stand in T-pose; help them adjust the pose to best match the one in the figure.

2. Click the Receive button.

You will enter Calibration Mode: the model will be hidden and only the helper objects will move. Your performer should stay still during this phase. After a few seconds, press ENTER to continue.

3. You are now receiving the capture! The 3D model should follow your performer's movements.

Record the capture

While you are receiving a capture you can record it as a Blender action. In the recording panel of the add-on just press Record capture; you should see the keyframes appearing on the timeline. To stop recording, press the button again.

Every time you start a new recording the add-on creates a new Blender action for it, and stores the previous one with a fake user. This means they won't be lost when you save your .blend file. To retrieve the stored actions, select them in Blender's Action Editor.
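The stored actions are also reachable from Blender's Python console. Here is a minimal sketch, assuming the default armature name "Chordata" and a placeholder action name (this only runs inside Blender, where the `bpy` module exists):

```python
import bpy  # available only inside Blender's embedded Python

# List every action in the file; the ones stored by the add-on keep a
# fake user, so they survive saving and reloading the .blend file.
for action in bpy.data.actions:
    print(action.name, "fake user:", action.use_fake_user)

# Re-assign a stored action to the armature to play it back.
armature = bpy.data.objects["Chordata"]  # name used by the default biped file
if armature.animation_data is None:
    armature.animation_data_create()
armature.animation_data.action = bpy.data.actions["Action"]  # placeholder name
```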

Export the capture

Once you have finished a capture session you can export the saved actions in many formats (FBX, BVH, glTF, etc.) using Blender's export feature. See here for more info.
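The same exports can be scripted through Blender's standard export operators. A rough sketch (runnable only inside Blender; the output paths are placeholders):

```python
import bpy  # available only inside Blender's embedded Python

# Export the scene to FBX, baking the animation into the file.
# (Path is a placeholder -- change it to suit your project.)
bpy.ops.export_scene.fbx(filepath="/tmp/capture.fbx", bake_anim=True)

# Or export the active armature's animation as BVH.
bpy.ops.export_anim.bvh(filepath="/tmp/capture.bvh")
```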

Retransmit the data

Capture data can also be forwarded live so it can be used by another application in real time. Follow these steps:

From the retransmission panel:

  • Set the desired method (normally Unicast), and the destination IP and port
  • Set the properties you want to transmit (by default just the rotation)

From the advanced panel:

  • Mark the bones you want to send as Transmit bones (this section is only visible when the armature is in POSE MODE)

From the recording panel:

  • Hit Play and Transmit animation

You will get an OSC stream with the address:
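The stream consists of standard OSC messages, so any OSC-capable tool can consume it. As a rough sketch of what decoding one involves, here is a minimal pure-Python parser (stdlib only; it handles just float32 and int32 arguments, which is an assumption about the payload — extend it as needed):

```python
import struct

def _read_padded_string(data: bytes, offset: int):
    """Read a null-terminated OSC string padded to a multiple of 4 bytes."""
    end = data.index(b"\x00", offset)
    text = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # skip the padding nulls
    return text, offset

def parse_osc_message(data: bytes):
    """Parse one OSC message into (address, arguments)."""
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":  # 32-bit big-endian float
            (value,) = struct.unpack_from(">f", data, offset)
        elif tag == "i":  # 32-bit big-endian int
            (value,) = struct.unpack_from(">i", data, offset)
        else:
            raise ValueError(f"unsupported OSC type tag: {tag!r}")
        offset += 4
        args.append(value)
    return address, args
```

A rotation message would then come out as its address pattern plus a list of numeric components.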


Use Python to handle the data inside Blender

The idea is to provide a well-defined mechanism for modifying how the capture is handled by overriding a couple of methods. See this post on Blender's devtalk.

This feature is not implemented yet. You can, of course, modify the way the add-on works by tweaking the source.

The reason this is frozen is that we couldn't imagine a use case. If you would benefit from such a feature, please let us know in the forum.

Implement your own client

At the core of the Chordata framework there are well-defined protocols to control the remote capture services and share mocap data.

The Blender add-on is the first and main implementation of a Chordata client, but creating a different one shouldn't be difficult provided you use the right tools. There's currently no explicit documentation on how to build a client, but you can find detailed information on how the system is structured, and on the communication mechanisms between its services, in the How it works section of this wiki.
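On the receiving side, a client only needs to listen on the UDP port that the retransmission targets and feed the raw datagrams to an OSC parser. A minimal sketch of that listening side in Python (stdlib only; the default port here is a placeholder, not a documented Chordata value):

```python
import socket

def open_listener(host="127.0.0.1", port=6565):
    """Bind a UDP socket for the retransmitted stream.

    6565 is a placeholder -- use the destination port you set in the
    add-on's retransmission panel.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    return sock

def receive_packets(sock, count=1):
    """Block until `count` raw UDP datagrams arrive; return them as bytes."""
    packets = []
    for _ in range(count):
        data, _addr = sock.recvfrom(4096)
        packets.append(data)
    return packets
```

Each received datagram is then an OSC packet that any OSC library can decode.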

If you intend to implement a client, we'll be happy to assist you through our forum.