
AVEVA™ XR Studio

Hand

  • Last Updated: Oct 22, 2025
  • 7 minute read

The Hand node represents a hand in the scene environment. Hand presence and visualization are highly important when running immersive VR applications on devices such as Oculus or OpenVR, and even in AR experiences.

The Hand node provides an important set of functionalities, such as:

  • Support for different types of hands, device-related or generic.

  • Automatic link from the Oculus and OpenVR device matrices to the hand matrix.

  • Free or position-preset based finger management.

  • Exposed hand and index finger matrices.

  • Interaction with Rect2D nodes through ItemMonitor and Texture2D.

  • Collision with CMeshes.

Platform support

This node is fully supported on the XR-Windows platform.

It is partially supported on the XR-Portable Windows, XR-Portable iOS, XR-Portable Android, and XR-Portable WASM platforms.

| XR-WIN | XR-P-WIN | XR-P-IOS | XR-P-AND | XR-P-WASM |
| --- | --- | --- | --- | --- |
| Full support | Partial support | Partial support | Partial support | Partial support |

Hand types

There are five supported types of hand, which can be specified by the handType field.

Each type (except off) must correspond to a specific geometry model (mwxTemplate field) with a specific naming convention.

| Type | Description |
| --- | --- |
| left | The model represents a left hand. Presets are relative to the left hand. |
| right | The model represents a right hand. Presets are relative to the right hand. |
| vive | The model represents the OpenVR device currently in use. Fingers are not supported because the device is a fixed geometry. |
| oculus | The model represents the Oculus device currently in use. Fingers are not supported because the device is a fixed geometry. |
| off | No model is needed. The node represents hand presence without displaying anything. |

Note: Users can provide custom geometry models for hands and devices, but they must reflect the joint and geometry names used in the default models.
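As a minimal sketch, a left hand could be declared as follows; the model path user\hand_l.mwx is an assumed example and should be replaced with a model that follows the naming convention described above.

<!-- Left hand driven by a custom model; mwxTemplateTag names the geometry accordingly. -->
<Hand name="my_hand_l" mwxTemplate="user\hand_l.mwx" mwxTemplateTag="HAND_NAME" handType="left"/>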

Hand position

When using a tracked device, such as an Oculus Touch or Vive controller, the hand is automatically linked to the device through the sensorMatrix field.

When using the hand in screen mode, the best option is to link the hand to the camera using the cameraSpace attribute. The position and orientation fields can be used to offset the hand and adapt its placement to model-specific characteristics.
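For example, a screen-mode hand might be linked to the camera and offset slightly in front of it; the offset values below are illustrative only.

<!-- Hand attached to the camera; position and orientation offset its placement in camera space. -->
<Hand name="screen_hand_r" mwxTemplate="user\hand_r.mwx" handType="right" cameraSpace="true" position="0 -0.1 0.35" orientation="0 0 0"/>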

Finger posture

Each finger is managed through a set of four rotations (in radians). The current hand posture is defined by an mvec4 field named fingersValues, which consists of five rotation vectors from the thumb to the little finger.

For the long fingers, the four rotations are:

  • Angle between the first finger bone and the palm.

  • Angle between the middle finger bone and the first finger bone.

  • Angle between the last finger bone and the middle finger bone.

  • Left-right finger abduction angle.

For the thumb, the four rotations are:

  • Angle between the thumb and the palm.

  • Angle between the middle finger bone and the first finger bone.

  • Angle between the last finger bone and the middle finger bone.

  • Rotation of the thumb on its axis.
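As an illustration, a posture could be set directly through fingersValues, with one vector of four rotations (in radians) per finger, ordered from the thumb to the little finger; the angle values below are approximate, not calibrated.

<!-- Index extended, remaining long fingers curled: an approximate pointing posture set directly. -->
<Hand name="posture_hand_r" mwxTemplate="user\hand_r.mwx" handType="right" fingersValues="0.3 0.2 0.2 0, 0 0 0 0, 1.3 1.3 1.3 0, 1.3 1.3 1.3 0, 1.3 1.3 1.3 0"/>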

Managing finger posture

There are two ways to control the finger posture: by directly setting the posture angles or by using presets. You can mix both approaches; the last posture set takes priority.

You can edit presets by using the fingersValuesPresetDefinitions dictionary to add new presets or modify existing ones.

Fingers can react to a new posture in two ways, immediately or smoothly, depending on whether the useFingersValuesFollower field is activated.

When the follower is activated, posture values assigned to the fingersValuesFollower field are transferred to fingersValues progressively, with a delay based on the fingersValuesFollowerMul field value.

  • Direct posture set

You can change the finger posture directly by setting the fingersValues field, or the fingersValuesFollower field when useFingersValuesFollower is set to true.

  • Preset based posture set

You can assign one of the available postures using the fingersValuesPreset field. The available presets cover the most common hand postures, such as open, close, point, victory, or hello (a sketch follows this list).

It’s also possible to add, remove, or edit the available presets using the fingersValuesPresetDefinitions dictionary.

The preset is automatically assigned to fingersValues or to fingersValuesFollower, depending on the useFingersValuesFollower state.
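The fragment below sketches the preset-based approach combined with the follower; the attribute values are assumptions for illustration, and the preset name comes from the default dictionary listed in the fields table.

<!-- Apply the predefined "point" posture; the follower smooths the transition at the default rate. -->
<Hand name="preset_hand_r" mwxTemplate="user\hand_r.mwx" handType="right" useFingersValuesFollower="true" fingersValuesFollowerMul="2" fingersValuesPreset="point"/>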

Interacting with ItemMonitor

The position of the indexTerminalMatrix does not correspond to the index fingertip but to its base, at the joint with the next bone. The indexFingerTipOffset and getIndexFingerTipMatrix fields have been added to better map the fingertip position and improve interaction with ItemMonitors.

The offset is automatically added to the indexTerminalMatrix when the hand interacts with the monitor; it is also possible to get the resulting fingertip matrix using the getIndexFingerTipMatrix field.
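For instance, the fingertip mapping could be tuned per model through indexFingerTipOffset; the offset below is an assumed value, not a recommended setting.

<!-- Slightly larger fingertip offset for a model with a longer terminal index bone. -->
<Hand name="monitor_hand_r" mwxTemplate="user\hand_r.mwx" handType="right" indexFingerTipOffset="0 0 0.04"/>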

Colliding with CMeshes

The Hand node can manage different types of collision detection with collision meshes (CMesh).

The mode is set by the enableCollisionWithCMesh field value with the following options: none, index, hand, and methacarpus.

The value none disables collision detection, while methacarpus and index can be applied only to the left and right hand types.

While a collision is happening, the collisionWithCMesh value is set to true and the fully qualified collision mesh name is set in the collisionWithCMeshName field.
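As a sketch, index-finger collision detection could be enabled as follows; collisionWithCMesh and collisionWithCMeshName are read-only and are observed at runtime rather than set here.

<!-- Detect collisions between the index finger and the collision meshes in the scene. -->
<Hand name="collision_hand_r" mwxTemplate="user\hand_r.mwx" handType="right" enableCollisionWithCMesh="index"/>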

Code example

This is a code example for the Hand node.

<Hand name="test_hand_r" mwxTemplate="user\hand_r.mwx" mwxTemplateTag="HAND_NAME" handType="right" cameraSpace="true" position="0. -0.1 0.35" orientation="0 0 0" fingersValuesPreset="point"/>

Hand fields

These are the fields for the Hand node. Only node-specific fields are listed; fields obtained through inheritance are not included.

Field inheritance: NodeBase > Hand

| Fields | Type | Use | Default value | Description |
| --- | --- | --- | --- | --- |
| _debug | sbool | Optional | false | Shows the debug object. |
| cameraSpace | sbool | Optional | false | Links the hand to the camera position. This is useful in screen mode. |
| collisionWithCMesh | sbool | Read only | Internally calculated | Changes to true when a collision is detected. |
| collisionWithCMeshName | sstring | Read only | Internally calculated | Contains the name of the currently colliding CMesh. |
| enableCollisionWithCMesh | senum | Optional | none | Enables and defines the CMesh collision mechanism with the following options: none, hand, index, and methacarpus. |
| fingersValues | mvec4 | Optional | 0 0 0 0, 0 0 0 0, 0 0 0 0, 0 0 0 0, 0 0 0 0 | The current finger posture, expressed as five rotation vectors (in radians) from the thumb to the little finger. |
| fingersValuesFollower | mvec4 | Optional | 0 0 0 0, 0 0 0 0, 0 0 0 0, 0 0 0 0, 0 0 0 0 | Target posture values that are transferred progressively to fingersValues when useFingersValuesFollower is set to true. |
| fingersValuesFollowerMul | sfloat | Optional | 2 | Multiplier that controls how quickly the follower transfers fingersValuesFollower values to fingersValues. |
| fingersValuesPreset | sstring | Optional | none | Applies one of the predefined hand postures defined in fingersValuesPresetDefinitions. The value null can be set without changing the posture. |
| fingersValuesPresetDefinitions | dstring | Optional | {open=0 0 0 0,0 0 0 0,0 0 0 0,0 0 0 0,0 0 0 0} … {pinch=1 1 1 0,0 0 0 0,1.3 1.3 1.3 0,1.3 1.3 1.3 0,1.3 1.3 1.3 0} | This dictionary contains the definition of all hand posture presets. By default, it contains the values open, close, point, ok, hello, victory, horns, finger, and pinch. Users can add, modify, or remove entries from the dictionary. |
| getIndexFingerTipMatrix | smatrix | Read only | Internally calculated | Returns the matrix obtained by applying the indexFingerTipOffset to the indexTerminalMatrix. |
| handType | senum | Optional | left | Defines the hand type as left, right, vive, oculus, or off. |
| indexFingerTipOffset | svec3 | Optional | 0 0 0.03 | Offset relative to indexTerminalMatrix. Used to better map the index fingertip. |
| indexTerminalMatrix | smatrix | Read only | Internally calculated | The matrix of the index finger's terminal bone, which can be used to pick in the finger direction. |
| master | sbool | Optional | true | When set to true, the hand positioning is internally calculated. Setting it to false is used to represent hands that do not belong to the local user. |
| mwxTemplate | sstring | Mandatory | No default | The name of the MWX model to use. Its content must match the requirements of the selected handType. |
| mwxTemplateTag | sstring | Optional | HAND_NAME | Passes a parameter to the mwxTemplate to name the geometry accordingly. |
| orientation | svec3 | Optional | 0 0 0 | Adds an orientation offset. |
| pointToTarget | sbool | Optional | false | When set to true, the hand automatically orients to point at the target point. |
| position | svec3 | Optional | 0 0 0 | Adds a position offset. |
| sensorMatrix | smatrix | Optional | Internally calculated | Sets the hand position when using devices such as Oculus or OpenVR. |
| target | svec3 | Optional | 0 0 0 | When pointToTarget is set to true, this is the scene point that the hand points to. |
| touchpadAxis | svec2 | Optional | 0 0 | Used with OpenVR devices to set the touchpad position. |
| useFingersValuesFollower | sbool | Optional | false | When set to true, the finger posture change is smoothly animated using a follower. |
| visible | sbool | Optional | true | Hides or shows the hand. |
| worldMatrix | smatrix | Optional | Internally calculated (if master is set to true) | The current hand matrix. When the master field is set to false, the world matrix can be controlled from outside. This is used for showing remote users' hands. |
