
Hand controller/s component #14

Open

Description

@Truemedia

Some XR headsets on the market support hand gestures/hand tracking as a control interface, either as a primary or secondary means of driving interaction in mixed reality.

Currently, well-documented examples include the Apple Vision Pro, which relies on hand gestures for the vast majority of operation, and the Meta Quest 3, which offers hand tracking as an optional way for users to navigate operating-system user interfaces, as well as in a subset of applications that support hand tracking as a secondary/supplementary form of interaction (support for this feature has to be explicitly coded by the developer).

I propose we create a set of components specifically for developers to enable and use hand tracking and gesture capabilities within their Tres XR applications. This would include the following components (a usage sketch follows the list):

  • HandController.vue
    A component that enables tracking and eventing for a single hand; its primary prop would specify which hand (left or right) the component instance corresponds to.

  • HandControllers.vue
    A parent component that encapsulates a pair of HandController components to represent both the left hand and the right hand.
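
As a rough illustration of the developer-facing API, here is a hypothetical usage sketch. The component names, the `hand` prop, the `pinch` event, and the import path are all part of the proposal (or placeholders), not an existing TresJS API:

```vue
<script setup lang="ts">
// Hypothetical usage of the proposed components inside a TresJS scene.
// HandControllers, HandController, the `hand` prop, and the `pinch` event
// come from the proposal above; the commented-out import path is a placeholder.
import { TresCanvas } from '@tresjs/core'
// import { HandControllers, HandController } from '@tresjs/xr' // proposed

function onPinch(hand: 'left' | 'right') {
  console.log(`${hand} hand pinch detected`)
}
</script>

<template>
  <TresCanvas>
    <!-- Parent component wrapping both hands -->
    <HandControllers>
      <!-- Each child instance tracks the single hand selected via the `hand` prop -->
      <HandController hand="left" @pinch="onPinch('left')" />
      <HandController hand="right" @pinch="onPinch('right')" />
    </HandControllers>
  </TresCanvas>
</template>
```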

The components should comply with, and make use of, the WebXR Hand Input Module spec: https://www.w3.org/TR/webxr-hand-input-1/
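
Internally, each HandController instance could poll the joints the module exposes via XRInputSource.hand (an XRHand map of joint spaces) and XRFrame.getJointPose(). The sketch below is plain TypeScript, assuming WebXR type definitions such as @types/webxr are available and a session requested with the 'hand-tracking' optional feature; the thumb-to-index pinch distance is just an illustrative heuristic, not something defined by the spec:

```ts
// Minimal per-frame hand polling sketch. The WebXR calls (inputSource.hand,
// hand.get(jointName), frame.getJointPose) are from the Hand Input Module spec;
// how HandController.vue wires this into Tres is an assumption.
function pollPinchDistance(
  frame: XRFrame,
  inputSource: XRInputSource,
  referenceSpace: XRReferenceSpace,
): number | undefined {
  // inputSource.hand is only populated when the session was granted the
  // 'hand-tracking' feature and the user's hands (not controllers) are tracked.
  const hand = inputSource.hand
  if (!hand) return undefined

  // XRHand behaves like a Map from joint names to XRJointSpace objects.
  const thumbTip = hand.get('thumb-tip')
  const indexTip = hand.get('index-finger-tip')
  if (!thumbTip || !indexTip) return undefined

  const thumbPose = frame.getJointPose?.(thumbTip, referenceSpace)
  const indexPose = frame.getJointPose?.(indexTip, referenceSpace)
  if (!thumbPose || !indexPose) return undefined

  // Distance between thumb tip and index fingertip; an illustrative heuristic
  // would be to emit a 'pinch' event when this drops below roughly 2 cm.
  const dx = thumbPose.transform.position.x - indexPose.transform.position.x
  const dy = thumbPose.transform.position.y - indexPose.transform.position.y
  const dz = thumbPose.transform.position.z - indexPose.transform.position.z
  return Math.sqrt(dx * dx + dy * dy + dz * dz)
}
```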


Metadata

Labels

enhancement (New feature or request), p3-significant (High-priority enhancement)
