Unity-StarterSamples Issues

Issues with Scene API

Hi, I'm using the Scene API on Quest 3, but it doesn't work in the built APK, while it does work over Quest Link.
Do you know the reason? I'd appreciate any help!
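One possible cause (an assumption on my part, not confirmed in the report): in a built APK the app must declare and be granted the spatial data permission com.oculus.permission.USE_SCENE, whereas over Quest Link the desktop runtime supplies scene data without that prompt. A minimal runtime check, assuming a recent Meta XR Core SDK:

```csharp
using UnityEngine;
using UnityEngine.Android;

// Sketch: verify the Scene permission at startup in a built APK.
public class ScenePermissionCheck : MonoBehaviour
{
    private const string ScenePermission = "com.oculus.permission.USE_SCENE";

    private void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(ScenePermission))
        {
            // Without this grant, Scene API queries return no data on device.
            Permission.RequestUserPermission(ScenePermission);
        }
    }
}
```

The permission also needs a corresponding `<uses-permission>` entry in the Android manifest, which the SDK's manifest patching should add when Scene Support is enabled on OVRManager.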

Wide Motion Mode doesn't work when using OpenXR

Description

When switching to the OpenXR Plugin, Wide Motion Mode does not work.

Using the Oculus XR Plugin, Wide Motion Mode works as expected.

Setup

Unity: 2021.3.36f1
Packages: Meta XR Core SDK v62.0.0; OpenXR Plugin 1.10.0; XR Plugin Management 4.4.0
Device: Quest 3, OS version 63.0.0.399.366

Repro

With no other modifications to the code or project configuration:

  • Open the HandsInteractionTrain Scene
  • Find the OVRCameraRig, enable "Wide Motion Mode Hand Poses Enabled", and add the Body Tracking Support permission
  • Save the scene
  • Build using Oculus > Samples > Build Starter Scene and deploy to device
  • Run the app and navigate to the HandsInteractionTrain Scene
  • Raise your hands so they are detected, then place them by your waist while looking down, then move them around behind your back.
  • You should observe that the hands stay visible even though the device can no longer see them.
  • Back in Unity, install the OpenXR Plugin package
  • Under Project Settings > XR Plug-in Management switch Plug-in Provider to OpenXR
  • Build using Oculus > Samples > Build Starter Scene and deploy to device
  • Run the app and navigate to the HandsInteractionTrain Scene
  • Raise your hands so they are detected, then place them by your waist while looking down, then move them around behind your back.
  • This time, note that the hands disappear as soon as the device can no longer see them.

Expected Results

Hands should remain tracked even when the device can't see them when using Wide Motion Mode with the OpenXR plugin.

Actual Results

Wide motion mode does not work when running OpenXR

Workarounds

This bug seems to be caused by the missing OpenXR extension XR_META_hand_tracking_wide_motion_mode. You can create a custom OpenXR feature (as mentioned in issue #12) that includes that extension, after which Wide Motion Mode works.
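The workaround above can be sketched as a minimal custom OpenXR feature. The class name, UiName, and FeatureId below are hypothetical; OpenxrExtensionStrings is what causes the extension to be requested at session creation:

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.XR.OpenXR.Features;
#endif
using UnityEngine.XR.OpenXR.Features;

// Hypothetical feature that only requests the missing extension so the
// runtime exposes it; no OpenXRFeature callbacks need to be overridden.
#if UNITY_EDITOR
[OpenXRFeature(UiName = "Meta Wide Motion Mode (workaround)",
    BuildTargetGroups = new[] { BuildTargetGroup.Android },
    OpenxrExtensionStrings = "XR_META_hand_tracking_wide_motion_mode",
    Company = "Example",
    FeatureId = "com.example.wide-motion-mode",
    Version = "0.0.1")]
#endif
public class WideMotionModeFeature : OpenXRFeature
{
}
```

Enable the feature under Project Settings > XR Plug-in Management > OpenXR. The same pattern should apply to the other missing extensions reported elsewhere on this page (XR_META_passthrough_preferences, XR_META_headset_id).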

Issues with Scene and Anchor Samples

Ran into issues with the Unity-Discover reference project that are related to some of the starter samples. Mentioned it here: oculus-samples/Unity-Discover#11

With Starter Samples, trying to run the SceneManager or Bouncing Ball scenes via Link in the Editor fails. SceneManager just loads passthrough with no volumes. Bouncing Ball similarly shows no volumes, and the balls simply fall away to infinity.

With the SpatialAnchor sample, I am able to create and save an anchor (although saving does throw a warning in the console, the green save icon appears). Trying to load anchors afterwards fails with an error (screenshot attached to the issue).

Unity Package Manager Error [One or more packages could not be added to the local file system]

Hi,

When opening the package in Unity 2022.3.11f1, I get the following error:

Unity Package Manager Error

An error occurred while resolving packages:
One or more packages could not be added to the local file system:
com.meta.xr.sdk.core: Request [GET https://download.packages.unity.com/com.meta.xr.sdk.core/-/com.meta.xr.sdk.core-59.0.0.tgz] failed with status code [502]
com.meta.xr.sdk.platform: Request [GET https://download.packages.unity.com/com.meta.xr.sdk.platform/-/com.meta.xr.sdk.platform-59.0.0.tgz] failed with status code [502]

Click on Retry to relaunch Unity and reopen your project.
![Screenshot 2023-12-11 at 14 50 17](https://github.com/oculus-samples/Unity-StarterSamples/assets/11411924/4c9a4b4c-fa6b-4148-bfcf-784eaefa2fb3)

Any idea how I could fix it?

Thanks!

Hands are at the wrong angle

Just downloaded the version for SDK 63.0.0.

In all scenes with hand tracking, the hands appear at the wrong angle, off by something like 90 degrees in all directions (or thereabouts).

I do see the hand icons in the right place when I turn my hands, though.

Errors during the Build Starter Scene

Hi, I am following the documentation on developer.oculus.com.
Unfortunately, I am receiving a bunch of errors after launching Oculus -> Samples -> Build Starter Scene.

Unity 2021.3.26f1 Apple Silicon
The first error looks like this:

'[Oculus.VR]OVRManager' has an extra field 'expandMixedRealityCapturePropertySheet' of type 'System.Boolean' in the player and thus can't be serialized (expected 'launchMultimodalHandsControllersOnStartup' of type 'System.Boolean')
UnityEditor.BuildPipeline:BuildPlayer (UnityEditor.BuildPlayerOptions)
OculusBuildSamples:Build (string,string[]) (at Assets/StarterSamples/Editor/BuildSamples.cs:183)
OculusBuildSamples:BuildStartScene () (at Assets/StarterSamples/Editor/BuildSamples.cs:138)


My apologies for bothering you, but can you please help me find the root cause?

OVROverlay example Depth Buffer Testing doesn't work with OpenXR plugin

Description

When switching to the OpenXR Plugin, the overlay using depth buffer testing in the OVROverlay scene sample no longer composites correctly against other objects in the scene.

Using the Oculus XR Plugin, the compositing works as expected.

Setup

Unity: 2021.3.36f1
Packages: Meta XR Core SDK v62.0.0; OpenXR Plugin 1.10.0; XR Plugin Management 4.4.0
Device: Quest 3, OS version 63.0.0.399.366

Repro

With no other modifications to the code or project configuration:

  • Open the OVROverlay.unity scene
  • Add a single default cube to the scene, ensure it sits between the camera and the OverlayUIGeometry object
  • Build using Oculus > Samples > Build Starter Scene and deploy to device
  • Select the OVROverlay scene from the starter scene menu
  • Switch between 'OVROverlay' and 'Application' in the menu
  • Note that the cube sorts correctly in front of the background images
  • Back in Unity, install the OpenXR Plugin package
  • Under Project Settings > XR Plug-in Management switch Plug-in Provider to OpenXR
  • Build using Oculus > Samples > Build Starter Scene and deploy to device
  • Select the OVROverlay scene from the starter scene menu
  • Switch between 'OVROverlay' and 'Application' in the menu

Expected Results

The cube sorts correctly in front of the background images, as it does with the Oculus XR plugin.

Actual Results

In 'Application' mode the background images are rendered by the WorldVsOverlayComparison > WorldspaceUIGeometry object on the Default layer directly into the main camera. In this mode the cube sorts correctly against this geometry.

In 'OVROverlay' mode the background images are rendered by the WorldVsOverlayComparison > OverlayUIGeometry object on the OVROverlayUI layer into an orthographic camera writing to a render texture. The render texture is then set as the source texture for an OVROverlay component on the OVROverlayQuad-DisplayCamRT object with the 'Enable Depth Buffer Testing' flag set. This overlay should sort in front of the cube, but instead the overlay is rendered on top of the cube.

Does not work out of the box

I have downloaded a fresh install of Unity 2022.3.17f1, created Unity projects from multiple Unity templates as the starting base, brought the Meta XR All-in-One SDK into all of them, git-cloned this repo, and opened the samples such as the passthrough example. (All developer runtime features are enabled in the Oculus App, such as Passthrough over Link, etc.)

The Oculus Link cable is connected and I can see the PCVR home environment. I can open the passthrough scene and all the sample scenes from this repo no problem, but when I click Play in the Editor I only get this error on all the sample scenes:

Unable to start Oculus XR Plugin.
Possible causes include a headset not being attached, or the Oculus runtime is not installed or up to date.
If you've recently installed or updated the Oculus runtime, you may need to reboot or close Unity and the Unity Hub and try again.
UnityEngine.Debug:LogWarning (object)
Unity.XR.Oculus.OculusLoader:Initialize () (at ./Library/PackageCache/[email protected]/Runtime/OculusLoader.cs:182)
UnityEngine.XR.Management.XRGeneralSettings:AttemptInitializeXRSDKOnLoad () (at ./Library/PackageCache/[email protected]/Runtime/XRGeneralSettings.cs:148)

Are the MR features being blocked for PCVR even in the Unity Editor? Everything is set correctly; under Project Settings > Oculus, all items are fixed. This does not work out of the box. My Quest 3 works fine in PCVR via the official Link cable, as I stated.

Specs: Quest 3, RTX 4090, 7800X3D, 16GB RAM. I also have a Quest Pro.

I have no issues running with regular OpenXR on the Quest 3 in other Unity projects for VR support, or using Virtual Desktop's passthrough feature with those projects, but I need some of the interactions this repo provides for hands, passthrough, etc. to get an idea of how this will all work for an eventual standalone conversion. I really don't want to rely on a Virtual Desktop cutout greenscreen AR, though; I also wanted to experiment with the Quest Pro eye-tracking features, and trying to cut and paste things from this repo would not be ideal for me. It's easier for me to bring my stuff into your samples (not exporting the assets and reimporting as per your instructions). Also, out of principle, I don't think it's fair to rely on one guy, Ggodin (the Virtual Desktop dev), to be a crutch for people stuck in the PCVR development sphere who want to explore standalone MR.

I was going to convert an app of mine for mixed reality but wanted to try it in the PC editor. I realize now that you cannot do PCVR builds with MR features or eye tracking, etc. (which would have been beneficial for gathering feedback from PCVR users on Discord), a big meh there, but I can work around it if I can at least use the Editor on PC with MR features myself.

My application is pretty graphically intense and only works for PCVR at this time. It does monoscopic video and image-to-3D conversion using models such as the lightweight MiDaS, BEiT, or the new Marigold depth estimation model, but it could potentially run on standalone.

I did find a section under the Oculus menu (top) > Platform > Edit Settings, and it says something about login. If API access is somehow required just to experiment with passthrough in the Unity Editor, and I would have to send an APK to the headset to test my app, then I would not pursue the project, because there is no way it would run on standalone yet without major rework; I need to test it out first with all the Meta SDK features in the Editor.

My feedback to Boz, Mark, or whoever on the team makes these decisions: this all needs to work a bit better/smoother and be more streamlined at the dev level, with more open and better PCVR support if MR is indeed locked down to a standalone requirement. It's hindering creative development, and I currently can't develop in the Unity Editor in real time with the mixed reality features; I have tried on two separate systems here. Of course there could be something I'm missing, and I apologize if I'm mistaken, but this is what I am seeing so far.

Also, a side question: why is the PCVR app still called the "Oculus" app, and why is there no 120 Hz support over the Link cable when there is in Virtual Desktop? IMO, this all needs to open up or there is no trickle-down effect to standalone. My situation here can't be unique.

Spatial Anchor saving sample doesn't work with OpenXR plugin

Note: May be related to #8

Description

When switching to the OpenXR Plugin, the Spatial Anchor scene sample is no longer able to save anchors. This is the same regardless of whether the sample saves locally (the default) or is switched to OVRSpace.StorageLocation.Cloud.

Using the Oculus XR Plugin, both Local and Cloud saving work as expected.

Setup

Unity: 2021.3.36f1
Packages: Meta XR Core SDK v62.0.0; OpenXR Plugin 1.10.0; XR Plugin Management 4.4.0
Device: Quest 3, OS version 63.0.0.399.366

Repro

With no other modifications to the code or project configuration:

  • Install the OpenXR Plugin package
  • Under Project Settings > XR Plug-in Management switch Plug-in Provider to OpenXR
  • Build using Oculus > Samples > Build Starter Scene
  • Deploy to device
  • Select Spatial Anchor from the start scene menu
  • Create an anchor
  • Select the anchor
  • Attempt to Save anchor

Expected Results

The anchor saves, the green icon is shown, and on re-running the app and selecting Load Anchors the anchor is loaded in the same location with the same GUID.

Actual Results

Nothing happens and the green icon doesn't show. On re-running the app and selecting Load Anchors, the previously created anchor is not visible.

Checking the logs, the following error is printed when attempting to save the anchor:
[SaveSpaceList] m_XR_FB_spatial_entity_storage_batch extension is not available (arvr/projects/integrations/OVRPlugin/Src/Util/CompositorOpenXR.cpp:12734)

Issue saving SpatialAnchor locally

I cloned the repo and, without changing anything, built the SpatialAnchor scene with Unity 2022.3.8. I tried saving anchors locally and it wouldn't work. In the log I saw that the OVRSpatialAnchor.Save function returns false for success, with no further information.

I changed the SaveOptions to StorageLocation.Cloud on OVRSpatialAnchor to try saving to the Cloud, and it worked correctly.

Is there some kind of setting or write permission that I'm not setting correctly that prevents me from saving locally?
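For reference, this is roughly the call being compared (a sketch, assuming the OVRSpatialAnchor.Save(SaveOptions, callback) API from Meta XR Core SDK versions of that era; later SDKs replace it with SaveAnchorAsync):

```csharp
using UnityEngine;

// Sketch comparing the two storage locations from the report.
public class AnchorSaveExample : MonoBehaviour
{
    public void SaveAnchor(OVRSpatialAnchor anchor)
    {
        anchor.Save(new OVRSpatialAnchor.SaveOptions
        {
            // Local is the default and is what fails in this report;
            // switching to OVRSpace.StorageLocation.Cloud reportedly works
            // (Cloud requires Enhanced Spatial Services enabled on the device).
            Storage = OVRSpace.StorageLocation.Local
        },
        (savedAnchor, success) =>
            Debug.Log($"Save success: {success}, uuid: {savedAnchor.Uuid}"));
    }
}
```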

OVRManager.IsPassthroughRecommended() always returns false when using OpenXR Plugin

Description

When switching to the OpenXR Plugin, OVRManager.IsPassthroughRecommended() always returns false.

Using the Oculus XR Plugin, OVRManager.IsPassthroughRecommended() returns true or false depending on whether passthrough is enabled on the home screen.

Setup

Unity: 2021.3.36f1
Packages: Meta XR Core SDK v62.0.0; OpenXR Plugin 1.10.0; XR Plugin Management 4.4.0
Device: Quest 3, OS version 63.0.0.399.366

Repro

Add the following script to the sample project:

using TMPro;
using UnityEngine;

public class IsPassthroughPreferred : MonoBehaviour
{

    TextMeshPro _text;

    // Start is called before the first frame update
    void Start()
    {
        var go = new GameObject("Debug Text");

        _text = go.AddComponent<TextMeshPro>();
        var rectTransform = go.GetComponent<RectTransform>();
        rectTransform.position = new Vector3(0, 0.8f, 1);
        rectTransform.localScale = new Vector3(0.02f, 0.02f, 0.02f);
        rectTransform.sizeDelta = new Vector2(50, 5);
        OVRManager.instance.isInsightPassthroughEnabled = true;
    }

    // Update is called once per frame
    void Update()
    {
        _text.text = $"Passthrough Preferred {OVRManager.IsPassthroughRecommended()}";
    }
}

  • Add the script to any game object in StartScene.unity
  • Save the scene
  • Build using Oculus > Samples > Build Starter Scene and deploy to device
  • Ensure that the device has passthrough on in the OS
  • Run the app and observe the text field saying "Passthrough Preferred True"
  • Back in Unity, install the OpenXR Plugin package
  • Under Project Settings > XR Plug-in Management switch Plug-in Provider to OpenXR
  • Build using Oculus > Samples > Build Starter Scene and deploy to device
  • Run the app and observe the text field saying "Passthrough Preferred False"

Expected Results

OVRManager.IsPassthroughRecommended() returns a value that reflects the passthrough setting in the OS, as it does with the Oculus XR plugin.

Actual Results

OVRManager.IsPassthroughRecommended() always returns false

Workarounds

This bug seems to be caused by the missing OpenXR extension XR_META_passthrough_preferences. You can create a custom OpenXR feature (as mentioned in issue #12) that includes that extension, after which the call to OVRManager.IsPassthroughRecommended() works.

[SceneManager] Desk prefab is upside down when rendering

I'm trying the SceneManager scene. It loads successfully, but with the desk prefab upside down.
I tried rotating it when the OnSceneModelLoadedSuccessfully event is invoked, but it seems like its transform gets reset.
Is there a way to fix this?
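One workaround worth trying (hypothetical, not verified against this bug): defer the rotation by one frame so it is applied after OVRSceneManager has finished setting the prefab's transform. DeskRotationFix and the serialized references below are illustrative names:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: rotate the desk instance one frame after the scene model loads,
// so the correction is not overwritten by OVRSceneManager's own transform pass.
public class DeskRotationFix : MonoBehaviour
{
    [SerializeField] private OVRSceneManager _sceneManager;
    [SerializeField] private Transform _deskInstance;

    private void OnEnable() => _sceneManager.SceneModelLoadedSuccessfully += OnSceneLoaded;
    private void OnDisable() => _sceneManager.SceneModelLoadedSuccessfully -= OnSceneLoaded;

    private void OnSceneLoaded() => StartCoroutine(RotateNextFrame());

    private IEnumerator RotateNextFrame()
    {
        yield return null; // wait one frame
        _deskInstance.Rotate(180f, 0f, 0f, Space.Self);
    }
}
```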

Multiple bugs with HandsInteractionTrainScene

I wanted to try the new Meta UPM packages and this samples repo, and I encountered multiple issues with the HandsInteractionTrainScene.

Poke interactors (blue spheres) don't follow the finger tips (only very occasionally do the poke spheres get correctly updated to the finger tips), which makes it hard to press the buttons.

The skybox and button highlight shaders don't work with Stereo Rendering Instancing (as recommended by the Oculus Project Setup Tool).

Lightmaps are broken (screenshot attached to the issue).

I encounter these issues in a new Unity 2022.3.11f1 project (Built-in Render Pipeline) with Meta XR All-in-One SDK 60.0.0.

OVRPlugin.GetSystemHeadsetType() always returns Oculus_Quest_2 when using OpenXR Plugin

Description

When switching to the OpenXR Plugin, OVRPlugin.GetSystemHeadsetType() always returns Oculus_Quest_2, even when running on other devices such as the Quest 3.

This is significant because, when using the Interaction package, this function call is used by OVRControllerHelper to determine which controllers to show the user.

Using the Oculus XR Plugin, OVRPlugin.GetSystemHeadsetType() returns the correct headset type.

Setup

Unity: 2021.3.36f1
Packages: Meta XR Core SDK v62.0.0; OpenXR Plugin 1.10.0; XR Plugin Management 4.4.0
Device: Quest 3, OS version 63.0.0.399.366

Repro

Add the following script to the sample project:

using TMPro;
using UnityEngine;

public class GetSystemHeadsetType : MonoBehaviour
{

    TextMeshPro _text;

    // Start is called before the first frame update
    void Start()
    {
        var go = new GameObject("Debug Text");

        _text = go.AddComponent<TextMeshPro>();
        var rectTransform = go.GetComponent<RectTransform>();
        rectTransform.position = new Vector3(0, 0.8f, 1);
        rectTransform.localScale = new Vector3(0.02f, 0.02f, 0.02f);
        rectTransform.sizeDelta = new Vector2(50, 5);
    }

    // Update is called once per frame
    void Update()
    {
        _text.text = $"GetSystemHeadsetType {OVRPlugin.GetSystemHeadsetType()}";
    }
}

  • Add the script to any game object in StartScene.unity
  • Save the scene
  • Build using Oculus > Samples > Build Starter Scene and deploy to device
  • Run the app on a Quest 3 and observe the text field saying "GetSystemHeadsetType Meta_Quest_3"
  • Back in Unity, install the OpenXR Plugin package
  • Under Project Settings > XR Plug-in Management switch Plug-in Provider to OpenXR
  • Build using Oculus > Samples > Build Starter Scene and deploy to device
  • Run the app and observe the text field saying "GetSystemHeadsetType Oculus_Quest_2"

Expected Results

OVRPlugin.GetSystemHeadsetType() returns the headset type the app is running on, as it does with the Oculus XR plugin.

Actual Results

OVRPlugin.GetSystemHeadsetType() always returns Oculus_Quest_2.

Workarounds

This bug seems to be caused by the missing OpenXR extension XR_META_headset_id. You can create a custom OpenXR feature (as mentioned in issue #12) that includes that extension, after which the call to OVRPlugin.GetSystemHeadsetType() works.
