3. Data Casting

SDK Broadcasting Data (6.3.1)

  1. Click 'Data Casting' at the top of the software interface. The default network card address for XINGYING is '10.1.1.198'; the IP address can also be changed in the drop-down box.

  2. XINGYING supports SDK secondary development, broadcasting data for use on the same machine or on other computers within the same IP segment. The network card address can be changed live through the 'Network Card Address' drop-down box; it is the address from which the server sends data. To receive motion capture data, the client computer must remain in the same network segment as the server.
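As a quick sanity check for the same-segment requirement, the short Python sketch below verifies whether a client address falls in the server's segment. The /24 subnet mask is an assumption for illustration; use your LAN's actual mask.

```python
import ipaddress

def same_segment(server_ip: str, client_ip: str, prefix: int = 24) -> bool:
    """Return True if both hosts fall in the same IPv4 network.

    The /24 prefix is an assumption, not something the manual specifies.
    """
    server_net = ipaddress.ip_network(f"{server_ip}/{prefix}", strict=False)
    return ipaddress.ip_address(client_ip) in server_net

# The default XINGYING network card address from this section:
print(same_segment("10.1.1.198", "10.1.1.50"))   # True  (same /24 segment)
print(same_segment("10.1.1.198", "10.1.2.50"))   # False (different segment)
```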

  3. The software supports dynamic IP acquisition, eliminating the need to restart. It can work in tandem with C++/C# and other plugins to acquire motion capture data in real time. Please consult our technical engineers to obtain the plugin version.

  4. SDK Streaming: The 'SDK Enable' option is not selected by default and can only be selected while the cameras are disconnected or paused. Once selected, the setting is saved in the configuration, and SDK streaming is enabled by default the next time the software starts. After enabling the SDK, motion capture data is broadcast externally through the set network card IP address using the SDK protocol. The software supports 'Unicast' and 'Multicast' modes below, with 'Multicast' as the default.
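For the default 'Multicast' mode, a client typically subscribes to a multicast group on a UDP socket. The sketch below shows the standard-library pattern; the group address and port are hypothetical placeholders, since the actual values used by the XINGYING SDK are not given here (check with NOKOV support or the SDK plugin documentation).

```python
import socket
import struct

def open_multicast_socket(group: str, port: int, nic: str = "0.0.0.0") -> socket.socket:
    """Create a UDP socket subscribed to a multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # IP_ADD_MEMBERSHIP expects the packed group address plus the address
    # of the local interface that should receive the stream.
    mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(nic))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Hypothetical usage; on the client, nic could be the address of the NIC
# in the same segment as the server (e.g. 10.1.1.x):
#   sock = open_multicast_socket("239.255.42.99", 9001)
#   data, addr = sock.recvfrom(65535)  # blocks until an SDK packet arrives
#   sock.close()
```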

  5. After selecting the SDK option, the lower left corner of the 3D view in Live mode displays the latency and its unit in real time (6.3.2). If the SDK function is turned off, the latency is not displayed. Latency is not displayed in real time in Edit mode.

  6. Skeleton Coordinates: Skeleton coordinates default to 'Global' (6.3.1) and can be switched between 'Global' and 'Local' in the drop-down menu. The setting is saved in the configuration, and the next time XINGYING is launched the previously set coordinate type is restored:

    • Global: the SDK broadcasts skeleton data in global coordinates; the client receives global skeleton coordinate data.

    • Local: the SDK broadcasts skeleton data in local coordinates; the client receives local skeleton coordinate data.

    • For rigid body data, the skeleton data broadcast externally is identical whether 'Global' or 'Local' is selected. For human data, the broadcast skeleton data differs between global and local skeleton coordinates.

    • To change the skeleton coordinate type, the SDK must be shut down first; otherwise the skeleton coordinates option is greyed out and cannot be changed.
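To illustrate why 'Global' and 'Local' differ for human data: local data stores each bone relative to its parent, so a client must compose the parent transforms to recover the global pose. The sketch below is illustrative only (quaternion layout (w, x, y, z) and the bone chain are assumptions, not the SDK's actual data format):

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    """Rotate vector v by unit quaternion q: q * (0, v) * conj(q)."""
    w, x, y, z = q
    qv = quat_mul(quat_mul(q, (0.0, *v)), (w, -x, -y, -z))
    return qv[1:]

def local_to_global(chain):
    """chain: list of (local_rotation, local_position) from root to leaf."""
    g_rot, g_pos = (1.0, 0.0, 0.0, 0.0), (0.0, 0.0, 0.0)
    for l_rot, l_pos in chain:
        offset = rotate(g_rot, l_pos)           # parent rotation moves the bone
        g_pos = tuple(p + o for p, o in zip(g_pos, offset))
        g_rot = quat_mul(g_rot, l_rot)          # accumulate orientation
    return g_rot, g_pos

# 90° rotation about Z at the root, then a child bone 1 unit along X:
half = math.sqrt(0.5)
rot, pos = local_to_global([((half, 0.0, 0.0, half), (0.0, 0.0, 0.0)),
                            ((1.0, 0.0, 0.0, 0.0), (1.0, 0.0, 0.0))])
# pos is approximately (0, 1, 0): the child bone ends up along the Y axis.
```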


VRPN Streaming (6.3.3)

  1. Before turning on VRPN in 'Data Casting', pause the software or make sure the cameras are not connected. After enabling VRPN, motion capture data is broadcast externally through the set network card IP address using the VRPN protocol.

  2. VRPN data can be divided into three types: 'Rigid', 'Marker', and 'Marker (Unnamed)'. Both Live mode and Edit mode can transmit data over VRPN. Before using VRPN, you can verify that VRPN data is received with our 'NOKOVVrpnClient.exe' test tool, or with open-source VRPN code or tools. Below is a brief introduction to using the NOKOVVrpnClient test tool to obtain each type of data. (Please consult a technical engineer if you need this test tool.)

  3. Checking the 'Rigid' type sends data about the rigid body. The naming method in the test tool is 'Rigid Body Name@10.1.1.198'. Check 'Rigid' and 'VRPN Enable', then click play to put the camera into play mode. Open a terminal in the directory containing NOKOVVrpnClient.exe, input the command '.\NOKOVVrpnClient.exe Tracker0@10.1.1.198' and press enter (6.3.4). You can then obtain the rigid body data, where 'Tracker0' corresponds to the name of the rigid body in the motion capture software.
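VRPN clients address devices with the 'device@host' convention shown above. A minimal helper for splitting such an address (the function name is illustrative, not part of any NOKOV API):

```python
def parse_vrpn_address(address: str):
    """Split a VRPN device address like 'Tracker0@10.1.1.198'
    into (device_name, host)."""
    device, _, host = address.partition("@")
    if not device or not host:
        raise ValueError(f"not a VRPN device@host address: {address!r}")
    return device, host

print(parse_vrpn_address("Tracker0@10.1.1.198"))      # ('Tracker0', '10.1.1.198')
print(parse_vrpn_address("Body0_SHead@10.1.1.198"))   # ('Body0_SHead', '10.1.1.198')
```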

  4. The naming method for the human body is 'MarkerSet Name_Skeleton Name'. For example, to get the information of the human head skeleton, input the command '.\NOKOVVrpnClient.exe Body0_SHead@10.1.1.198' and press enter; you then obtain the head skeleton data of the human body named 'Body0'. 'SHead' represents the name of the head skeleton. To get other skeletons, replace 'SHead' with the corresponding skeleton name.

  5. The 'Marker' type transmits data about named Marker points. The naming method in the test tool is 'Rigid Body Name_Marker Name@10.1.1.198'. After selecting 'Marker' and 'VRPN Enable', click 'play' to put the camera into play mode. Use the command prompt to navigate to the directory where NOKOVVrpnClient.exe is located. For example, to get the data of the point 'Marker1' belonging to the rigid body 'Tracker0', run the command '.\NOKOVVrpnClient.exe Tracker0_Marker1@10.1.1.198' and press enter (6.3.5). You then get the data of the point 'Marker1', where 'Tracker0' corresponds to the name of the rigid body in the software and 'Marker1' is the name of the point belonging to it. The specific names of the rigid body's points can be viewed in the Assets - Component - Markers list.

  6. The 'Marker(Unnamed)' type transmits data about unnamed Marker points. The naming method is 'U_Tracker + Unnamed Marker Index (starting from 0)@10.1.1.198'. Select 'Marker(Unnamed)' and 'VRPN Enable', then click 'play' to put the camera into play mode. Open a terminal in the directory containing NOKOVVrpnClient.exe, input the command '.\NOKOVVrpnClient.exe U_Tracker0@10.1.1.198' and press enter (6.3.6); you can then obtain the data of the unnamed Marker points. Here, 'U_Tracker0' represents the first unnamed point; to get the second unnamed point, change 'U_Tracker0' to 'U_Tracker1'.
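The naming schemes in items 3 through 6 can be collected into small helpers for building test-tool addresses. This is a convenience sketch (the function names are illustrative; only the name patterns come from the steps above), assuming the default server address:

```python
SERVER = "10.1.1.198"  # the default XINGYING network card address

def rigid_tracker(rigid_name: str, host: str = SERVER) -> str:
    """'Rigid' type: 'Rigid Body Name@host'."""
    return f"{rigid_name}@{host}"

def skeleton_tracker(markerset: str, bone: str, host: str = SERVER) -> str:
    """Human body: 'MarkerSet Name_Skeleton Name@host'."""
    return f"{markerset}_{bone}@{host}"

def named_marker_tracker(rigid_name: str, marker: str, host: str = SERVER) -> str:
    """'Marker' type: 'Rigid Body Name_Marker Name@host'."""
    return f"{rigid_name}_{marker}@{host}"

def unnamed_marker_tracker(index: int, host: str = SERVER) -> str:
    """'Marker(Unnamed)' type: unnamed markers are indexed from 0."""
    return f"U_Tracker{index}@{host}"

print(rigid_tracker("Tracker0"))           # Tracker0@10.1.1.198
print(skeleton_tracker("Body0", "SHead"))  # Body0_SHead@10.1.1.198
print(unnamed_marker_tracker(1))           # U_Tracker1@10.1.1.198
```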

  7. The 'Unit' option box allows you to select different units of measurement. You can choose from 'millimeter', 'centimeter', and 'meter'. After changing the unit, the data in VRPN will adjust accordingly.

  8. In 'Invert', checking the x, y, z coordinates reverses the sign of the variables selected in 'Type'. 'qx, qy, qz' represent the rotational data of the rigid body. After selecting 'Rigid Body' and running '.\NOKOVVrpnClient.exe Tracker0@10.1.1.198', 'quat' displays the rotational data of the rigid body 'Tracker0'. After selecting 'qx, qy, qz', the coordinate values of 'quat' are inverted. The 'quat' value shows only the rotation of the rigid body; the 'Marker' and 'Marker(Unnamed)' types carry no rotation information.

  9. In 'Offset', you can set the 'x, y, z' coordinate offsets for the data variables. After setting the offset values, VRPN adds these offsets to the original coordinates.
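The 'Unit', 'Invert', and 'Offset' options above can be modeled as a small per-axis post-processing step. The order scale → invert → offset in this sketch is an assumption for illustration; the software's internal order is not documented here.

```python
# Scale factors relative to the software's native millimeter output.
UNIT_SCALE = {"millimeter": 1.0, "centimeter": 0.1, "meter": 0.001}

def postprocess(xyz, unit="millimeter", invert=(), offset=(0.0, 0.0, 0.0)):
    """Apply unit scaling, axis inversion, and offsets to one position sample."""
    scale = UNIT_SCALE[unit]
    out = []
    for axis, value, off in zip("xyz", xyz, offset):
        value *= scale            # convert to the selected unit
        if axis in invert:
            value = -value        # reverse the sign of checked axes
        out.append(value + off)   # add the configured offset
    return tuple(out)

print(postprocess((1000.0, -500.0, 250.0), unit="meter",
                  invert=("x",), offset=(0.0, 0.5, 0.0)))
# → (-1.0, 0.0, 0.25)
```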

  10. When 'Velocity' and 'Acceleration' are checked, the corresponding data is output in VRPN. 'Frames' is used for the speed calculation; adjusting the frame factor aligns the output speed value with the actual speed value.
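A velocity derived from positions is a finite difference of consecutive frames, which is why the frame factor matters: it sets the time step. The sketch below illustrates this; the frame-rate values are examples, not software defaults.

```python
def velocity(prev_pos, cur_pos, frames_per_second=60.0):
    """Per-axis velocity from two consecutive position samples.

    frames_per_second plays the role of the 'Frames' factor: a wrong
    value scales every output speed by the same wrong ratio.
    """
    dt = 1.0 / frames_per_second
    return tuple((c - p) / dt for p, c in zip(prev_pos, cur_pos))

# 0.01 m along x and -0.02 m along z between two frames at 100 fps:
v = velocity((0.0, 0.0, 0.0), (0.01, 0.0, -0.02), frames_per_second=100.0)
print(v)  # → (1.0, 0.0, -2.0)  in m/s
```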

Network Streaming

  1. XINGYING is compatible with the Xsens MVN protocol and can be used with software that supports it. XINGYING data can be sent through Haption software to Dassault Systemes' Delmia and Catia to drive human models and props, and to BOB software to drive human models.
