User Manual/5. Capturing

From Chordata documentation

The time has finally come! Wasn't building your own mocap system lots of fun? Imagine how much fun it will be to use it! Let's get to it.

{{wip}}

==Capture using the Blender add-on==

{{note | In the near future, once the Utility software is up and running, you will be able to start the capture directly from the add-on.

In the meantime you have to access the SBC's terminal environment, normally through SSH, and start the capture manually. See [[SSH | this page]] for information on how to do it.

Follow the [[Roadmap]] to check the state of the development. }}

[[File:DefBypedConf-IDModules.png | thumb | Default biped configuration]]

Make sure you placed the sensors according to the [[Default Biped Configuration]]. Then run the notochord with these commands:

<syntaxhighlight lang="bash">
cd bin
./notochord -c ../default_biped.xml <YOUR COMPUTER'S IP ADDRESS>
</syntaxhighlight>

===Startup and In-pose calibration===

In order to receive the data you need an armature that matches the distribution of sensors on the performer. You can download a model for the Default biped on our {{Downloadlink |downloads page}}.

Load the file in Blender and make sure the "Armature Object" field of the Chordata add-on is set to the armature "Chordata".

{{note | The current in-pose calibration algorithm doesn't implement an overall heading offset. As a consequence, the performer has to stand looking at a particular cardinal direction while calibrating (matching the direction the 3D model is pointing).

This feature will be implemented in future versions; see the [[Roadmap]] to check the state of the development. }}

1. Ask your performer to '''look towards South''' and stand in '''T-pose'''. Help him or her adjust the pose to best match the one in the figure.

2. Click the <code>Receive</code> button.
:: You will enter ''Calibration Mode'': the model will be hidden and only the helper objects will move. Your performer should stay still during this phase. After a few seconds press ENTER to continue.

3. You are now receiving the capture! The 3D model should follow your performer's movements.

===Record the capture===

While you are receiving a capture you can record it as a Blender action. On the ''recording panel'' of the add-on just press <code>Record capture</code>; you should see the keyframes popping up in the timeline. To stop it just press the button again.

Every time you start a new recording the add-on creates a new Blender action for it, and stores the previous one with a fake user. This means that they won't be discarded when you save your <code>.blend</code> file. In order to retrieve the stored actions you can select them in [https://docs.blender.org/manual/en/latest/editors/dope_sheet/action.html Blender's Action editor].
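
If you prefer to inspect the stored takes with a script, here is a minimal sketch using Blender's Python API. The object name "Chordata" and the action name are assumptions; use the names from your own scene:

<syntaxhighlight lang="python">
import bpy

# List every action saved in the file. Recorded takes are kept alive
# by their fake user, so they survive saving and reloading the file.
for action in bpy.data.actions:
    print(action.name, "fake user:", action.use_fake_user)

# Re-assign a stored take to the armature to play it back.
armature = bpy.data.objects["Chordata"]
armature.animation_data_create()
armature.animation_data.action = bpy.data.actions["Action.001"]
</syntaxhighlight>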
  
 
===Export the capture===

Once you have finished a capture session you can export the saved actions in many formats (like FBX, BVH, glTF) using Blender's export feature. See [https://docs.blender.org/manual/en/latest/data_system/files/import_export.html here] for more info.
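
Exports can also be scripted with Blender's bundled exporters. A minimal sketch, assuming the armature object is named "Chordata" (the file paths are placeholders):

<syntaxhighlight lang="python">
import bpy

# Make the armature the active, selected object before exporting.
armature = bpy.data.objects["Chordata"]
bpy.context.view_layer.objects.active = armature
armature.select_set(True)

# Export the whole scene, baked animation included, as FBX.
bpy.ops.export_scene.fbx(filepath="/tmp/capture.fbx", bake_anim=True)

# Export the active armature's current action as BVH.
bpy.ops.export_anim.bvh(filepath="/tmp/capture.bvh")
</syntaxhighlight>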
  
 
===Retransmit the data===

Capture data can also be forwarded live, to be used by another application in real time. Follow these steps:

'''From the retransmission panel:'''
:*Set the desired method (normally Unicast), and the destination IP and port
:*Set the properties you want to transmit (by default just rotation)

'''From the advanced panel:'''
:*Mark the bones you want to send as a ''Transmit bone'' (this section is only visible when the armature is in POSE MODE)

'''From the recording panel:'''
:*Hit <code>Play and Transmit animation</code>

'''You will get an OSC stream with the address:'''

<code>/Chordata/<property>/<name_of_the_bone></code>
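
To get a feel for the stream you can listen to it with a few lines of Python. This is a minimal sketch using the third-party python-osc package (an assumption; any OSC library will do), listening on a placeholder port matching the one set in the retransmission panel:

<syntaxhighlight lang="python">
from pythonosc import dispatcher, osc_server

def handle_message(address, *args):
    # Addresses look like /Chordata/<property>/<name_of_the_bone>;
    # args carries the values of the transmitted property.
    parts = address.split("/")
    if len(parts) == 4 and parts[1] == "Chordata":
        _, _, prop, bone = parts
        print(bone, prop, args)

disp = dispatcher.Dispatcher()
# Route every incoming message to the handler above.
disp.set_default_handler(handle_message)

# Listen on the IP and port you set in the retransmission panel
# (9000 here is a placeholder).
server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 9000), disp)
server.serve_forever()
</syntaxhighlight>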
  
 
===Use python to handle the data inside Blender===

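Inside Blender the received capture drives the pose bones of the armature, so regular <code>bpy</code> scripting applies. As a starting point, a minimal sketch (the object name "Chordata" is an assumption) that prints each pose bone's rotation whenever the scene updates:

<syntaxhighlight lang="python">
import bpy

def print_bone_rotations(scene, *args):
    # Read the live pose of the armature each time the scene
    # updates, e.g. while the add-on is receiving a capture.
    armature = bpy.data.objects.get("Chordata")
    if armature is None or armature.type != 'ARMATURE':
        return
    for bone in armature.pose.bones:
        print(bone.name, bone.rotation_quaternion)

# Run the handler after every dependency-graph update.
bpy.app.handlers.depsgraph_update_post.append(print_bone_rotations)
</syntaxhighlight>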
  
 
==Implement your own client==
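
Any environment that can receive OSC over UDP can act as a client: point the retransmission panel at its IP and port, and parse the <code>/Chordata/<property>/<name_of_the_bone></code> addresses as in the receiver sketch above.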
