
Unity-Movement

Unity-Movement is a package that uses OpenXR’s tracking layer APIs to expose Meta Quest Pro’s Body Tracking (BT), Eye Tracking (ET), and Face Tracking (FT) capabilities. With this package, developers can leverage tracking to populate VR environments with custom avatars that bring the expressiveness of users into the virtual environments that they create.

License

The Unity-Movement package is released under the Oculus License. The MIT License applies to only certain, clearly marked documents. If an individual file does not indicate which license it is subject to, then the Oculus License applies.

Requirements

Getting Started

First, ensure that all of the requirements are met.

Then, bring this package into the project.

The sample scenes are located under the Samples/Scenes folder.

Unity Setup

If your new or existing scene doesn't have a GameObject with the OVRCameraRig component, follow these steps:

  1. From the Hierarchy tab, look for a Main Camera GameObject.
  2. If the Main Camera GameObject is present, right-click Main Camera and click Delete.
  3. Using the top level file menus, navigate to Oculus->Tools->Building Blocks. Select the (+) icon on the lower right of the Camera Rig option.
  4. Select the Camera Rig object in the Hierarchy, and in the Inspector tab, go to OVR Manager > Quest Features.
  5. In the General tab, there are options to enable body, face, and eye tracking support. Select Supported or Required for the type of tracking support you wish to add.
  6. Under OVRManager's "Permission Requests On Startup" section, enable Body, Face and Eye Tracking.
  7. Ensure that OVRManager's "Tracking Origin Type" is set to "Floor Level".
  8. In OVRManager's "Movement Tracking" section select "High" for "Body Tracking Fidelity."
  9. In OVRManager's "Movement Tracking" section select "Full Body" for "Body Tracking Joint Set."
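If you prefer to request the runtime permissions from your own script rather than relying solely on OVRManager's startup requests, a minimal sketch might look like the following (the class name is hypothetical; the permission strings are the documented Quest Android permissions):

```csharp
using UnityEngine;
using UnityEngine.Android;

// Hypothetical helper: requests the tracking permissions on startup.
public class TrackingPermissionRequester : MonoBehaviour
{
    private static readonly string[] Permissions =
    {
        "com.oculus.permission.BODY_TRACKING",
        "com.oculus.permission.EYE_TRACKING",
        "com.oculus.permission.FACE_TRACKING"
    };

    private void Start()
    {
        foreach (var permission in Permissions)
        {
            if (!Permission.HasUserAuthorizedPermission(permission))
            {
                Permission.RequestUserPermission(permission);
            }
        }
    }
}
```

Attach this to any GameObject in the startup scene; tracking components will only receive data once the user grants the corresponding permission.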

Layer index 10, layer index 11, and the HiddenMesh layer must be present in the project for RecalculateNormals to work correctly.

Some Project Settings can be validated via Movement->Check Project Settings. For a more thorough check, please use Oculus->Tools->Project Setup Tool.

Rendering Quality

Navigate to your Project Settings (Edit->Project Settings...) and click on the "Quality" section. If your project uses URP, then some of these settings might be part of the rendering pipeline asset currently in use. The pipeline picked will be shown in the Quality menu.

The following settings are recommended:

  1. Four bones for Skin Weights.
  2. 2x or 4x Multi Sampling Anti Aliasing.
  3. Full resolution textures.
  4. Shadow settings:
    • Hard and soft shadows.
    • Very high shadow resolution.
    • Stable fit.
    • Shadow distance of 3 meters with cascades. This will allow viewing shadows nearby without experiencing poor quality.
  5. At least one pixel light.
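The settings above can also be applied from a script via Unity's built-in QualitySettings API; a sketch for the built-in pipeline follows (URP projects configure most of these on the pipeline asset instead):

```csharp
using UnityEngine;

// Applies the recommended quality settings at runtime (built-in pipeline).
public static class RecommendedQuality
{
    public static void Apply()
    {
        QualitySettings.skinWeights = SkinWeights.FourBones;          // 1. four bone weights
        QualitySettings.antiAliasing = 4;                             // 2. 4x MSAA
        QualitySettings.masterTextureLimit = 0;                       // 3. full-resolution textures
                                                                      //    (globalTextureMipmapLimit on newer Unity)
        QualitySettings.shadows = ShadowQuality.All;                  // 4. hard and soft shadows
        QualitySettings.shadowResolution = ShadowResolution.VeryHigh;
        QualitySettings.shadowProjection = ShadowProjection.StableFit;
        QualitySettings.shadowDistance = 3f;                          //    nearby shadows only
        QualitySettings.pixelLightCount = 1;                          // 5. at least one pixel light
    }
}
```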

Samples

The project contains several sample scenes. To test the samples, they must be imported into the project's Assets folder:

  • Select the "Meta Movement" package in the package manager. Once selected, expand the Samples section and import the desired sample scenes.

For more information about body tracking, please refer to this page.

For more information about the samples, please refer to this page.

Player Settings

Make sure that the color space is set to Linear.

Build Settings

In order for the SceneSelectMenu buttons to work, add the scenes imported during the Samples step to the Build Settings.
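Scenes can also be appended to the Build Settings with a small editor-only helper; a sketch (the scene path shown in the comment is a placeholder for wherever the samples were imported):

```csharp
using System.Linq;
using UnityEditor;

// Editor-only helper: appends a scene to Build Settings if it isn't there yet.
public static class BuildSceneHelper
{
    // scenePath e.g. "Assets/Samples/.../MovementRetargeting.unity"
    public static void AddScene(string scenePath)
    {
        var scenes = EditorBuildSettings.scenes.ToList();
        if (scenes.All(s => s.path != scenePath))
        {
            scenes.Add(new EditorBuildSettingsScene(scenePath, true));
            EditorBuildSettings.scenes = scenes.ToArray();
        }
    }
}
```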

Documentation

The documentation for this package can be found here. The API reference for this package can be found here.

License

Unity-Movement is subject to the Oculus SDK License Agreement, as found in the LICENSE file.

Contributors

actions-user, andkim-meta, facebook-github-bot, mvaganov-meta, semantic-release-bot, sohailshafii, sohailshafiiwk, tseglevskiy, vyzant


Issues

Missing textures in repo

Would it be possible to get the teeth, tongue and eyelash textures for Aura? I don't currently see them in the repo. Thanks!

Also, the high fidelity model doesn't have a proper face texture, and the neutral AO file appears to be corrupt. Could we get updated versions of those?

How can I accurately calculate the annotated position of the eyes?

Hello, I'm working on eye-based interaction with the interface, but I've recently encountered a problem.
I'm trying to calculate the annotated position of the eyes on a plane and add a cursor. In my script, I use the avatar's lefteye component to initialize the transform, and based on the position and forward direction of this transform, I perform collision detection with the plane.
However, it's weird that there is an offset from the target I'm annotating.
image
image
In the picture above, I'm actually looking at the red ball, but the rendered cursor (the small red dot) has an offset.
Can you please advise on how to resolve this issue? Thanks!
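For reference, a common pattern for this kind of gaze cursor is a physics raycast from the eye transform; a minimal sketch follows (the _eyeTransform and _cursor fields are assumptions about the asker's setup — note that a transform taken from the avatar's eye bone may carry a modeling offset, so the OVREyeGaze component's own transform is usually the safer ray origin):

```csharp
using UnityEngine;

// Hypothetical gaze-cursor sketch: casts a ray from an eye transform
// and places a cursor where the ray hits a collider (e.g. the target plane).
public class GazeCursor : MonoBehaviour
{
    [SerializeField] private Transform _eyeTransform; // e.g. the OVREyeGaze transform
    [SerializeField] private Transform _cursor;       // small marker object

    private void Update()
    {
        var ray = new Ray(_eyeTransform.position, _eyeTransform.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, 10f))
        {
            _cursor.position = hit.point;
        }
    }
}
```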

InputSubsystem Not Found

I receive an "InputSubsystem Not Found" error when trying to play any of the scenes under Meta Movement => Samples => Scenes.

Steps to reproduce:

  1. Created new Unity 2021.3.27f1 project
  2. Imported Oculus Integration SDK version 54.1
  3. Configured all settings indicated in: https://developer.oculus.com/documentation/unity/unity-conf-settings/
  4. Imported Movement SDK
  5. Tried to play cube scene. It works great.
  6. Tried to play movement samples scenes. Scenes render to headset, but character does not move with my controller, head, or face movements.

I am using an Oculus Quest 2 connected to a Windows machine using Link

Note: During step #2, I selected "Yes" when prompted: "This project is using the new input system package but the native platform backends for the new input system are not enabled in the player settings. This means that no input from native devices will come through. do you want to enable the backends?"

Thank you for your assistance!

Sample scenes InvalidOperationException: Avatar is null.

When I enter play mode with for example the MovementHighFidelity sample scene, I see the below error:

2024-01-03_18-02-08

Tracking seems to be off for multiple bones. There is an offset between the OVRCameraRig hand position and the HighFidelityFirstPerson hand mesh position (about 30cm or so) and also other bones of the avatar body seem to track incorrectly resulting in a stretched body.

Another issue: when I smile with my mouth closed, the avatar smiles with its mouth open.

Unity 2022.3.11f1, Meta all-in-one 60.0.0 and Meta movement 4.0.1.

[OVREyeGaze] Failed to start eye tracking.

Hello fellow developers :)

Been trying to get this working over oculus air link, but with no success yet.

Both Oculus Software/firmware have been freshly installed / updated.
Using the latest Oculus SDK.

Verified all older posts and solutions regarding this warning, but nothing helped.

(six screenshots attached)

Oculus_Logs_DESKTOP-0BSF5QS_20231225_122821.zip

Hope all the above information helps, and thank you so much for taking the time!

Can anyone help me? T.T

Has anyone converted this character to be available on vrchat?

(image: Untitled)

If anyone has already done it, can I get the Unity package file? I will pay for it.

Sample (3.1.1) not working with new Meta XR SDKs from Package Manager (59.0.0)

Environment

Unity 2022.3.10f1
Oculus Integration (deprecated) 56 from Asset Store
Movement Sample was imported by downloading the zip and adding it as a custom package in the project (v 3.1.1)

This setup worked fine.

Migration to new Meta Packages

Followed the steps for migration from here.

Deleted everything Oculus and OVR and installed the following packages:

  • com.meta.xr.sdk.core
  • com.meta.xr.simulator
  • com.meta.xr.sdk.platform
  • com.meta.xr.sdk.audio
  • com.meta.xr.sdk.interaction
  • com.meta.xr.sdk.voice
    All packages are installed as version 59.0.0.

This resulted in a compile error where the AnimatorBoneVisualizerBoneTupleDrawer can not find the OVRSkeletonBoneVisualizerBoneTupleDrawer.

Console output:

Packages\Unity-Movement-3.1.1\Editor\Utils\AnimatorBoneVisualizerBoneTupleDrawer.cs(20,13): error CS0103: The name 'OVRSkeletonBoneVisualizerBoneTupleDrawer' does not exist in the current context

Seems like some of the sample utilities have a direct dependency on utilities in the Oculus Integration, which makes migration impossible.

Issue with Full Body Tracking on Custom Avatar in Unity 2022.3.16f1

Hello,

I'm currently working with Unity version 2022.3.16f1 and Movement version 4.0.1, and I've encountered an issue related to full body tracking on a custom avatar, initially designed in Blender. Following the guidelines provided in the documentation (link), I've imported the avatar into Unity as an FBX file.

After setting the animation type to Humanoid and enabling Translation DoF for the avatar configuration, everything appeared normal under the "Muscles & Settings" options, with all avatar parts moving correctly.

The problem arises when implementing full body tracking. Using my Quest Pro, I observed that the arm movements of the avatar are not tracking accurately. This issue persists even after testing with Unity's standard Robot Kyle avatar, suggesting that the problem may not be exclusive to my custom model.

I am including a video clip demonstrating the specific tracking issue, along with the avatar file I've been using. Any insights or suggestions to resolve this tracking issue would be greatly appreciated.

Thank you for your assistance.

Avatar.zip

bug.mp4

Few issues with New Project Setup (Face, Eye, And Body)

Guys great work on this project I am having a lot of fun !

Few things which I believe to be problems with new projects:

Looks like there are a couple of layers not included when you create a new project and import the Movement SDK.

These layers currently need to be created manually:

  • HiddenMesh
  • Character
  • MirroredCharacter

Now I have Body, Face, and Eye tracking enabled in the OVR Camera Rig, and the Android Manifest has the proper permissions. The Oculus app also has all the settings needed to run this project with Oculus Link, as shown below:

image

Now, I think body tracking may be missing from the above? Or is this error something different?

[OVRPlugin] [XRCMD][failure] [XR_ERROR_RUNTIME_FAILURE]: xrLocateBodyJointsFB(m_xrBodyTracker, &info, &locations), arvr\projects\integrations\ovrplugin\src\util\compositoropenxr.cpp:5217
UnityEngine.Debug:LogWarning (object)
OVRManager:OVRPluginLogCallback (OVRPlugin/LogLevel,intptr,int) (at Assets/Oculus/VR/Scripts/OVRManager.cs:1628)

The result is that I can do eye tracking and face tracking successfully, but body tracking is not working at all. I have a feeling this is due to body tracking not being available through Oculus Link? I can try to run this directly from the device to see if it works and report back with my results.

Thanks again for all you do !
Dilmer

IOBT lower body tracking question

When performing IOBT retargeting, is it possible to measure the user's height again and control avatar retargeting with a component?

For example, if a user sits down and takes off the VR device, full tracking of the avatar stops, but if the user stands again and puts the VR device on, the lower body will remain in a sitting state. Therefore, what I want to do is when the user puts the device on again, I recalculate the user's height and try avatar retargeting again.

Is it possible to use the calibration API as shown on this site? If you know how to use it, please reply.
https://developer.oculus.com/documentation/unity/move-body-tracking/#calibration-api

pose detection & tracking

First of all, I'm very interested in this. I am reviewing the code, but it is not quite easy. I want to know the primary scripts where pose detection and tracking happen, because I am trying to learn the logic of pose detection and tracking.
Can you please explain which scripts handle those?
Thanks!

Fail to start eye tracking.

Hello Oculus Developers, I hope this message finds you well.
I am currently facing an issue in Unity 2020.3.40fc1 while trying to use eye tracking with the up-to-date Oculus Integration.
I keep getting this warning and cannot use the OVREyeGaze script with a Quest Pro. Could you please help me?
(The eye-tracking permission has been enabled in the Oculus app and on the device.) The warning is as follows:

[OVREyeGaze] Failed to start eye tracking.
UnityEngine.Debug:LogWarning (object)
OVREyeGaze:StartEyeTracking () (at Assets/Oculus/VR/Scripts/OVREyeGaze.cs:136)
OVREyeGaze:OnEnable () (at Assets/Oculus/VR/Scripts/OVREyeGaze.cs:110)
UnityEngine.GUIUtility: ProcessEvent (int,intptr,bool&)

Since the old integration has been deprecated, I tried to add the new Meta XR SDK Core package in a new project of Unity 2022.3.7f1c1. But I am receiving the following error:

[Package Manager Window] Error adding package: [email protected].
undefined == true
UnityEditor.EditorApplication:Internal_CallUpdateFunctions ()

IOBT still uses inverse kinematics?

In non-IOBT mode ('low body tracking fidelity', 'inverse kinematics'), if I hold my arm at 90 degrees and wave, my elbow also moves significantly. This is expected, since IK that is based purely on hands cannot know if I am moving just the wrist or my entire arm.

However, in IOBT mode ('high body tracking fidelity') I see a similar (albeit reduced) effect. This happens even if my elbow is in clear view of the cameras.

com.pozear.PozeAR-20240302-094149.mp4

It's like IOBT is not actually tracking my elbow, and still doing IK based on my hand/wrist. Is this expected behaviour?

Issues with retargeting

I have setup my avatar according to documentation and all looks correct in the mapping -
image

Correct scripts have been added to character -
image

But the animation is completely out. Head is too high and not moving in sync with upper chest and shoulders, and hands are not rotating at the correct place. Wrist rotation seems to be happening higher up the lower arm.
image

custom avatar retargeting error with iobt(full body tracking)

Movie_007.mp4

=>Unity PlayMode

image ==>original avatar fbx

I want to do full body tracking using IOBT. If you install the Meta Movement SDK and view the sample scene, the avatar appears properly. However, if I substitute my own avatar FBX file, the avatar comes out distorted like that. Is there a way to automatically and accurately perform avatar retargeting?

I use a Meta Quest Pro and Unity 2023.3.17f1.
Please help!

Various Type Name does not exist in OVR*

Hi, I downloaded (https://assetstore.unity.com/packages/tools/integration/meta-xr-all-in-one-sdk-269657) following the entire procedure and managed to install it correctly and test it.
The problem arises when I try to use (https://github.com/oculus-samples/Unity-Movement) because, at the end of the download and installation on the Unity package manager, I get several error messages:
"Library\PackageCache\com.meta.movement@269e4a2\Runtime\Scripts\AnimationRigging\Legacy\FullBodyCustomMappings.cs(112,43): error CS0117: 'OVRPlugin.BoneId' does not contain a definition for 'FullBody_End'"
[..]
"Library\PackageCache\com.meta.movement@269e4a2\Runtime\Scripts\AnimationRigging\Legacy\FullBodyCustomMappings.cs(87,61): error CS0117: 'OVRPlugin.BoneId' does not contain a definition for 'FullBody_RightHandRingDistal' "

Version sdk: 59.0.0
Version Movement: 4.0.0
What did I do wrong?

Replace default avatar with other

Hello,
I want to replace the default avatars with others (like avatars from Character Creator 4). Despite several attempts, I can't reproduce the test scene (MovementHipPinning).

  • Automatic mapping (setup character for body tracking / format: unity Humanoid) gives me correct hand tracking, but the avatar is completely deformed (mainly on the arms).
  • And the oculus skeleton format completely distorts my character when I setup the corresponding bones.

What avatar source do you use (CC4D, Mixamo, ...)?
Are there any specific parameters to take into account during generation?

Thank you.

Hand tracking not working in unity editor using link

I'm currently using Windows 11 with a Quest Pro, a Quest 2, and a Quest 3.
I have tried in a new project in Unity 2022.3.7f1
with the newest Meta All-in-One SDK, version 60.
The project is on a private git repo, so a friend and I both have access to it.
My settings in the Oculus PC app are identical to his, and my headsets are also in developer mode,
but my friend's hands work and mine don't. The only difference is that he is using Windows 10 and I'm using Windows 11.

These are the logs when hitting play. I have set up the OVR rig building block and the hand tracking building block, and made sure the hands fields are set as left and right respectively; in Quest Features I have set Hands Only.
image

How to get VR user's skeleton information ?

If Body Tracking Support is enabled, the OVRBody script will poll for updated body data, which can be used to drive an animated skeleton or to provide positioning information for gameplay. But each user's skeletal information, like body height, is different. How does the OVRBody script get the user's skeletal information?
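For reference, once body tracking is running, the per-user joint poses can be read off an OVRSkeleton component driven by OVRBody; a sketch follows (user height itself is not exposed here — only the tracked joint poses, from which quantities such as head height above the floor can be derived):

```csharp
using UnityEngine;

// Sketch: reads tracked joint poses from an OVRSkeleton driven by OVRBody.
public class SkeletonReader : MonoBehaviour
{
    [SerializeField] private OVRSkeleton _skeleton;

    private void Update()
    {
        if (!_skeleton.IsInitialized || !_skeleton.IsDataValid)
        {
            return;
        }

        foreach (var bone in _skeleton.Bones)
        {
            // bone.Id identifies the joint; bone.Transform carries its pose.
            Debug.Log($"{bone.Id}: {bone.Transform.position}");
        }
    }
}
```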

Face expression and eye gaze data

Hello Oculus Developers, I'm wondering whether we can get the raw data of the facial blendshapes and eye gaze.
I'm preparing some blendshape work, so if there is anybody who knows the way to get this data, please let me know.
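For reference, the raw blendshape weights are exposed by the OVRFaceExpressions component, and gaze poses by OVREyeGaze; a sketch follows (the field wiring and the choice of JawDrop as an example expression are assumptions):

```csharp
using UnityEngine;

// Sketch: reads one raw face-expression weight and the eye-gaze orientation.
public class FaceAndGazeReader : MonoBehaviour
{
    [SerializeField] private OVRFaceExpressions _faceExpressions;
    [SerializeField] private OVREyeGaze _leftEyeGaze; // its transform follows the gaze

    private void Update()
    {
        // Each blendshape weight is a float in [0, 1].
        if (_faceExpressions.TryGetFaceExpressionWeight(
                OVRFaceExpressions.FaceExpression.JawDrop, out float jawDrop))
        {
            Debug.Log($"JawDrop weight: {jawDrop}");
        }

        Debug.Log($"Left eye gaze rotation: {_leftEyeGaze.transform.rotation}");
    }
}
```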

Hip pinning Question

Hey, I just noticed that the hip pinning isn't quite accurate in first person in the "MovementHipPinning" scene. Is there a setting or something that I've missed? Instead of being pinned, the torso follows the user almost straight down, but the mirrored version works as intended. I've added a photo of what I'm looking at: the first-person character essentially moves off the chair, whereas the mirrored one stays put. Hopefully it's just a button I've missed, haha. Cheers

Unable to start sample-scenes

Hello! After adding the sample-scenes from the package to the project we'll get this error:

Library\PackageCache\com.meta.movement@218e21a\Runtime\Scripts\AnimationRigging\RetargetingLayer.cs(329,31): error CS1061: 'Transform' does not contain a definition for 'SetLocalPositionAndRotation' and no accessible extension method 'SetLocalPositionAndRotation' accepting a first argument of type 'Transform' could be found (are you missing a using directive or an assembly reference?)

Do you know what to do with this error?

Meta Avatar Integration

Wondering if their are any plans to implement the Movement SDK with Meta Avatars or if someone has been able to get them working together?

I tried a few different things and they don't seem compatible. Would definitely increase the use of Meta avatars if they worked with the Movement SDK.

Open sample scenes in editor?

Hello,

I would like to view the sample scenes inside the Unity editor, when I try to open one it says it's in a read-only package?

image

For example, I am trying to access "smileeffect.cs", but the references seem to be missing. Not sure if it's because the samples were not properly connected to the read-only package?

image

Thanks

Invalid feet positions in HipPinning sample when the model is rotated.

Steps to reproduce:

  1. Open MovementHipPinning sample.
  2. Rotate Objects 90 degrees around Y axis.
    image
  3. Rotate OVRCameraRig 90 degrees around Y axis as well.
  4. Enable mesh and material for C_wetsuit_PLY under HighFidelityHipPinningFirstPerson as on screenshot below.
    image
  5. Run the sample.
  6. Feet are directed to the left.
    com itseez3d metaperson quest handtracking sample dev-20240215-120935

Assertion failure. Value was false.

Hi!
I have used Movement for a few days. Most of the time all the scenes worked well, but sometimes I get this error:
(screenshot attached)
I tried to find what's wrong, and it seems that the bone ID is always Invalid. The error can be eliminated by restarting Unity and Quest Link.
My relevant settings:

  1. Unity 2021.3.23f1
  2. XR Plug-in: Oculus
  3. Oculus Integration 53.1

Could you please help me?

oculus Build Error

I used 3.3.0 and 3.3.1 to build an Oculus app, and got a Unity build error:
Building Library\Bee\artifacts\Android\d8kzr\f57k_bly-CSharp.o failed with output:
E:...\Library\Bee\artifacts\Android\il2cppOutput\cpp\Assembly-CSharp.cpp(12849,3): error: no matching function for call to 'GeneratedWriters___Internal_Write___System_Collections_Generic_Dictionary_2U3CSystem_Int32U2CSystem_SingleU3EFishNet_Serializing_Generated_mC66156A37282AD027219EBAA953B521DC1B848F4'

but building with 2.6.6 succeeds.

Error on `RetargetingLayer.cs` after installing package

I am struggling with installing Movement SDK...
After installing the package using git, I got this error:

Library\PackageCache\com.meta.movement@f6654169ac\Runtime\Scripts\AnimationRigging\RetargetingLayer.cs(329,31): error CS1061: 'Transform' does not contain a definition for 'SetLocalPositionAndRotation' and no accessible extension method 'SetLocalPositionAndRotation' accepting a first argument of type 'Transform' could be found (are you missing a using directive or an assembly reference?)

I tried to (1) remove the library cache and re-install, and (2) reinstall from a local file.
However, I get the same error even when I make a new project.

This is my current relevant configuration.

  • Unity 2021.3.4f1
  • Oculus Integration 53.1
  • Windows 11
  • Build Setting -> Android,
  • XR Plug-in -> Oculus
  • Scripting Backend -> IL2CPP

Is there anything I could have missed?

OVREyeGaze doesn't track per eye (cross eyed not possible)

When checking the MovementAura and MovementHighFidelity scenes, I noticed the mirrored avatars don't look cross-eyed when I cross my eyes. I conclude from this that the OVREyeGaze script doesn't track per eye but instead gives back some averaged value. This contradicts what is stated in the documentation:

https://developer.oculus.com/documentation/unity/move-overview/#eye-tracking

The abstracted eye gaze representation that the API provides (gaze state per eye) allows a user’s character representation to make eye contact with other users. This can significantly improve your users’ social presence.

Tested with Unity 2022.3.11f1, Meta All-in-One 60.0.0 and Meta Movement 4.0.1

Body Tracking PCVR

Hey, I'm trying to use the Movement SDK to get upper body tracking for a pinned hip from the first person. Some of it still needs work, but I'd just like to know if the SDK is able to work in a PC build. I got it to work in the editor using the Windows build target, but when I build it to an .exe, nothing shows up except the skybox. Any ideas?

Editor.mp4
Build.mp4

Unity Profiler - Body tracking

Hey Devs,

I'm just looking into how much CPU the body tracking takes up, because it seems to be a bit too heavy to be able to add an environment as well. Can you let me know if this lines up on your end, or if there are any ways it can be optimised? I've also turned off Vsync for these tests, but just in case that didn't carry over, I changed the FPS goal to 120. Otherwise Vsync would be forcing delay on the more efficient "Just VR camera" comparison.

Just VR camera - No environment - Meta package
FPS Goal = 120fps
Recorded FPS = 119.9FPS
CPU Delay = 7.94ms
GPU Delay = 4.17ms
EarlyUpdate.XRUpdate = 56.3% = 4.46ms delay
FrameEvents.XRBeginFrame = 2.6% = 0.21ms delay

With Tracking - No environment - Meta package
FPS Goal = 120fps
Recorded FPS = 60.8FPS
CPU Delay = 14.84ms
GPU Delay = 8.80ms
EarlyUpdate.XRUpdate = 53.8% = 7.99ms delay
FrameEvents.XRBeginFrame = 17.6% = 2.62ms delay

Also, are there any better ways I could check that a delay is directly related to body tracking? There are the scripts and their associated numbers (e.g. OVRSkeleton.FixedUpdate() at 0.8%, a 0.13ms delay), but I feel like that's not registering sensor data etc.

Cheers,
Ezekiel

Cannot open project

I downloaded the source via Git, and upon trying to add the project to Unity Hub it says the project is not valid.

Tried in 2021.3.30 and 2022.3.7

I also tried to directly open a scene from within the folder, with the same issue. Do I need to change something?

Retargeting of hips rotation

Hello, I have two questions.

I'm using sample scene, retargeting, that you provided in Movement SDK.
As I controlled the avatar (ArmatureSkinningUpdateRetargetUser) with the left and right hand controllers, I found out that even though I'm not moving my torso, the avatar's hips keep rotating whenever I punch forward with my left or right hand. Is there any way to fix the hip rotation and make it not rotate?

Also, I'm smaller than the Armature, so even though I stretch my arms fully, the avatar's arms are slightly bent because of the difference in bone length between mine and the avatar's. Is there any way to solve this problem? I was thinking of using another model with lengths similar to mine, but that seemed burdensome, so I hoped there's a way to use the provided one.

Thank you very much.

Full body tracking

Hello,

is the recently announced update for full body tracking (including legs) already available in the recent com.meta.movement package version 3.1.1 ?

Thank you.

IOBT turns off when passthrough turns on

I modified the MovementRetargeting sample to turn on passthrough after 10 seconds.

Before passthrough turns on, I can use adb shell dumpsys activity service com.oculus.bodyapiservice.BodyAPIService and see that "iobt" : true. But after it turns on, adb shell immediately returns "iobt" : false

Is this expected behaviour? Does IOBT not work when passthrough is turned on?

Lower body tracking(IK) not there

Is fullbody tracking supported with Movement SDK?

I ran the sample scene "MovementRetargeting", but only upper body movements are tracked. The lower body always moves along with the upper body. It seems there is no IK behaviour on the legs.

Nothing much explained in the documentation as well, just body tracking.

Could you please confirm this?

Regards
Sabin M. George

Multiple Prefab Import Errors in Unity Package 'com.meta.movement'

Description

I'm encountering multiple errors when attempting to import prefabs from the 'com.meta.movement' Unity package. These errors suggest that the files might be corrupt or are missing Variant parent or nested Prefabs.

Steps to Reproduce

  1. Import the com.meta.movement package version 3.1.1 (2023-09-21) into Unity.
  2. Observe the errors in the console as Unity tries to import prefabs related to UI components and legacy menus.

Expected Behavior

Prefabs should import seamlessly, enabling the utilization of the sample UI and legacy menus provided by the package.

Actual Behavior

The import process results in multiple errors, preventing the use of the prefabs. The following errors are displayed:

  • [20:57:39] Problem detected while importing the Prefab file: 'Packages/com.meta.movement/Samples/Prefabs/UI/Legacy/RetargetingMenuLegacy.prefab'. The file might be corrupt or have a missing Variant parent or nested Prefabs.
  • [20:57:40] Problem detected while importing the Prefab file: 'Packages/com.meta.movement/Samples/Prefabs/UI/SceneSelectMenu.prefab'.
  • [20:57:41] Problem detected while importing the Prefab file: 'Packages/com.meta.movement/Samples/Prefabs/UI/TogglesMenu.prefab'.
  • [20:57:42] Problem detected while importing the Prefab file: 'Packages/com.meta.movement/Samples/Prefabs/UI/HipPinningMenu.prefab'.
  • [20:57:43] Problem detected while importing the Prefab file: 'Packages/com.meta.movement/Samples/Prefabs/UI/Legacy/SceneSelectMenuLegacy.prefab'.
  • [20:57:43] Problem detected while importing the Prefab file: 'Packages/com.meta.movement/Samples/Prefabs/UI/Legacy/HipPinningMenuLegacy.prefab'.

Screenshots

image

Environment

  • Unity Version: 2022.3.11
  • Operating System: Windows

Additional Context

The errors occurred using the recent release of the package, version 3.1.1, dated 2023-09-21.

Is this sdk for real user or unity-made 3p character

Hi. I love this Meta Movement SDK and am trying to use it in my project. Comparing this video (https://www.youtube.com/watch?v=T8sYk2rwmcs) with the content of this link (https://developer.oculus.com/documentation/unity/move-body-tracking/): even though the video shows that this SDK can be used for body tracking of a real user wearing a Meta Quest 2, I think the document implies this SDK is for a Unity third-person character, not for the Quest 2 user. Is that right? Or can this SDK also be used with a real user wearing the Meta Quest 2?

Geometry issues on PC version

The face has geometry issues in the PC demo. It shows a crack in the face on the right side only, and only at some expressions / angles. Not really important for development / SDK, but I just thought I would give the Meta team a heads-up in case they want to fix it.

Untitled

com.oculus.bodyapiservice stops working

Having closely followed the instructions for setting up the samples, whenever we try to run the scenes over Oculus Link, the error "com.oculus.bodyapiservice keeps stopping" appears. This has been tested on a Quest 2 and a Pro headset, with both cable and Air Link. In the Oculus app, the developer-mode beta settings for face and eye tracking are enabled. The scene then proceeds to run, but without face or eye tracking. No obvious warnings or errors appear in the Console log.

No input response

I ran and built this app following the instructions in the readme and the configuration page. The first issue I faced was that I could not open any of the scenes inside Samples, because it said they were part of a read-only package.

I solved this by copying all the scenes from the samples folder into a new Scenes/ folder inside my Assets. I then added all these scenes to the build settings and built the app for Android (which was successful).

However, when I run the APK on a Quest Pro, I am able to open and run the app, but it is not detecting any input (I am not able to change the menu). It detects the controllers and my hands, but when I press any buttons or perform any input, there is no effect.

Sample Scenes do not work in editor.

I am attempting to run the sample scenes in editor and get the following errors:
1.) Global joint set is invalid. Ensure that there is an OVRBody instance that is active an enabled with a valid joint set
UnityEngine.Debug:LogError (object)
OVRPlugin:GetSkeleton2 (OVRPlugin/SkeletonType,OVRPlugin/Skeleton2&) (at ./Library/PackageCache/[email protected]/Scripts/OVRPlugin.cs:8350)
OVRSkeleton:Initialize () (at ./Library/PackageCache/[email protected]/Scripts/Util/OVRSkeleton.cs:409)
OVRSkeleton:Start () (at ./Library/PackageCache/[email protected]/Scripts/Util/OVRSkeleton.cs:373)

2.)Global joint set is invalid. Ensure that there is an OVRBody instance that is active an enabled with a valid joint set
UnityEngine.Debug:LogError (object)
OVRPlugin:GetSkeleton2 (OVRPlugin/SkeletonType,OVRPlugin/Skeleton2&) (at ./Library/PackageCache/[email protected]/Scripts/OVRPlugin.cs:8350)
OVRSkeleton:Initialize () (at ./Library/PackageCache/[email protected]/Scripts/Util/OVRSkeleton.cs:409)
OVRSkeleton:UpdateSkeleton () (at ./Library/PackageCache/[email protected]/Scripts/Util/OVRSkeleton.cs:624)
OVRSkeleton:Update () (at ./Library/PackageCache/[email protected]/Scripts/Util/OVRSkeleton.cs:617)

Error 2 seems to happen every frame. But when I build the scenes to a headset, the tracking works and nothing seems wrong. This was all tested in a new project using Unity 2023.1.17f1.

Issue with "BoxProximityField"

Hi,

When I run the package in Unity, I get some errors which all say 'Oculus.Interaction.Deprecated.BoxProximityField' is missing the class attribute 'ExtensionOfNativeClass'! I think that's because the "BoxProximityField" class is used in the "UITextButton" and "UIToggleButton" prefabs.

However, when I checked the "Oculus" package, it mentioned that the "BoxProximityField" class was replaced by the "ClippedPlaneSurface" class. I tried deleting "BoxProximityField" and adding "ClippedPlaneSurface"; the program then runs without error, but nothing is shown in the Quest Pro. I'm not sure if this is because I modified the script component.
