
MRTunnelingPico's Introduction

This repository contains the Unity C# code for creating mixed reality (MR) tunneling effects, a method that balances the trade-off between the limited rendering performance and the high visual quality demands of video see-through head-mounted displays (VST-HMDs) by fusing the images of two camera sensors with different resolutions and frame rates.

It also demonstrates how the grayscale, low-resolution VST of a VR headset can be turned into an untethered HD color VST-HMD using a sensor fusion technique.

untethered demo

Feature 1: Basic MR Tunneling Effects

We merge the color video stream of an external stereoscopic camera with the grayscale VST of the VR headset. The external high-resolution VST, displayed from the central foveal to the para-peripheral region of the human visual field, complements the low-resolution, low-latency grayscale VST in the far peripheral region. The result is a tunneling effect that simulates human foveal and peripheral vision and has the potential to reduce cybersickness, similar to tunneling effects in immersive VR.
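
The per-pixel blending described above can be sketched as a radial weight mask: full weight for the external color stream near the screen center, smoothly falling off to the headset's grayscale VST toward the periphery. This is a minimal Python sketch, not the repository's Unity C# shader code; the radii and the smoothstep falloff are illustrative assumptions.

```python
import math

def smoothstep(edge0, edge1, x):
    """Hermite interpolation of x between edge0 and edge1, clamped to [0, 1]."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def tunnel_weight(u, v, inner_radius=0.35, outer_radius=0.5):
    """Weight of the external color VST at normalized screen coordinates
    (u, v) in [0, 1]^2: 1.0 inside the tunnel (foveal/para-peripheral
    region), falling to 0.0 toward the far periphery, where the headset's
    grayscale VST takes over. Radii are illustrative."""
    r = math.hypot(u - 0.5, v - 0.5)  # distance from the screen center
    return 1.0 - smoothstep(inner_radius, outer_radius, r)

def blend(color_px, gray_value, w):
    """Linear per-channel blend of an external color pixel (RGB tuple)
    with a headset grayscale pixel value."""
    return tuple(w * c + (1.0 - w) * gray_value for c in color_px)
```

In a shader this weight would be evaluated per fragment; the sketch only shows the math of the mask and the blend.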

Feature 2: MR Tunneling with Head Speed Accommodation

MR Tunneling with head speed detection accommodates the user's head movement speed: when fast head movements are detected, the external VST is faded out, potentially compensating for the video streaming latency.
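
The fade-out can be sketched as a simple mapping from angular head speed to the opacity of the external stream. This is an illustrative Python sketch, not the project's C# implementation; the `fade_start` and `fade_end` thresholds are assumed values, not ones taken from the repository.

```python
def external_vst_alpha(head_speed_deg_s, fade_start=60.0, fade_end=120.0):
    """Opacity of the external color VST as a function of angular head
    speed (degrees/second). Below fade_start the stream is fully visible;
    above fade_end it is fully faded out, leaving only the low-latency
    grayscale VST. Thresholds are illustrative."""
    if head_speed_deg_s <= fade_start:
        return 1.0
    if head_speed_deg_s >= fade_end:
        return 0.0
    # linear fade between the two thresholds
    return 1.0 - (head_speed_deg_s - fade_start) / (fade_end - fade_start)
```

In practice the head speed would come from the headset's rotation tracking, and the resulting alpha would modulate the tunnel mask each frame.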

untethered demo

Feature 3: Foveated MR Tunneling Effects

The foveated MR Tunneling effect displays the center of the external VST based on the tracked user eye movements.
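
Centering the tunnel on the gaze point amounts to projecting the tracked gaze direction into normalized screen coordinates. The following Python sketch assumes a symmetric perspective projection with a given half field of view; it is illustrative and does not use the PICO eye tracking API.

```python
import math

def tunnel_center(gaze_dir, half_fov_deg=45.0):
    """Map a gaze direction in head space (x right, y up, z forward)
    to the tunnel center in normalized screen coordinates [0, 1]^2,
    assuming a symmetric perspective projection with the given half-FOV.
    Illustrative assumption, not the headset's actual projection."""
    gx, gy, gz = gaze_dir
    tan_half = math.tan(math.radians(half_fov_deg))
    u = 0.5 + (gx / gz) / (2.0 * tan_half)
    v = 0.5 + (gy / gz) / (2.0 * tan_half)
    # clamp so the tunnel stays on screen
    return (max(0.0, min(1.0, u)), max(0.0, min(1.0, v)))
```

The resulting (u, v) would replace the fixed screen center used by the basic tunnel mask, so the high-resolution region follows the user's eye movements.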

untethered demo

Software Version

  1. VR Headset: The project was implemented targeting the Pico Neo 3 Pro Eye standalone VR headset with the Pico Unity SDK version 2.0.4.

  2. Stereo Camera: We use the ZED Mini mixed reality stereoscopic camera with the ZED SDK version 3.7.3.

  3. Unity: This project was tested on Unity version 2021.3.4f1.

  4. GStreamer Unity Plugin: This Unity project extends the mrayGstreamerUnity project (https://github.com/mrayy/mrayGStreamerUnity) and makes it compatible with any arm64 Android device. As a result, the plugin also works with other standalone VR headsets such as the Oculus Quest 2.

  5. Note: the user interface components are not included in this open-source repository, as they require a paid Unity asset, the Unity VR UI toolkit.

Accelerated GStreamer Pipeline

To minimize the video streaming and rendering latency (also called motion-to-photon (MTP) latency), the GStreamer pipeline used for this project is optimized with the NVIDIA DeepStream SDK, which drastically decreases the video encoding latency.

The GStreamer pipeline was tested on a ZED Box running Ubuntu 18.04.6 LTS, JetPack 4.6, ZED SDK version 3.7.3, GStreamer version 1.14.5, and the ZED GStreamer plugin. The average MTP latency was around 150 ms for Pipeline 1 when streaming over a dedicated Wi-Fi 6 router.

To use the pipeline, connect the VR headset and the streaming device to the same local network.

Pipeline 1: streaming HD720 stereoscopic video to the standalone Pico VR headset at 30 fps

 gst-launch-1.0 zedsrc camera-resolution=2 camera-fps=30 stream-type=2 ! autovideoconvert ! video/x-raw,format=BGRA ! queue ! nvvideoconvert ! nvv4l2h264enc maxperf-enable=1 preset-level=1 control-rate=1 bitrate=8000000 vbv-size=530000 ! 'video/x-h264, stream-format=(string)byte-stream, width=(int)1280, height=(int)1440' ! h264parse ! rtph264pay config-interval=-1 pt=96 ! udpsink clients=[VR_HEADSET_IP]:30000 max-bitrate=80000000000 sync=false async=false

Test pipeline: streaming HD720 with real-time computer vision and object detection features at 15 fps

gst-launch-1.0 zedsrc stream-type=2 od-enabled=true od-detection-model=0 camera-resolution=2 camera-fps=15 ! queue ! zeddemux is-depth=false name=demux demux.src_left ! queue ! zedodoverlay ! autovideoconvert ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! rtph264pay config-interval=-1 pt=96 ! queue ! udpsink clients=[VR_HEADSET_IP]:30000 max-bitrate=30000000 sync=false async=false demux.src_aux ! queue ! zedodoverlay ! autovideoconvert ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! rtph264pay config-interval=-1 pt=96 ! queue ! udpsink clients=[VR_HEADSET_IP]:30000 max-bitrate=30000000 sync=false async=false

For more implementation details of building an untethered VR device from a standalone VR headset via sensor fusion, please check out the publication.

Citation

If you find this repository useful for your research and work, please cite this work:

@article{Li2022MixedRT:inpress,
  title={Mixed Reality Tunneling Effects for Stereoscopic Untethered Video-See-Through Head-Mounted Displays},
  author={Ke Li and Susanne Schmidt and Reinhard Bacher and Wim Leemans and Frank Steinicke},
  journal={2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
  year={2022}
}

Acknowledgement

This work was supported by DASHH (Data Science in Hamburg - Helmholtz Graduate School for the Structure of Matter) under Grant No. HIDSS-0002, and by the German Federal Ministry of Education and Research (BMBF).


MRTunnelingPico's Issues

Only videotestsrc can be played

I compiled this repository and managed to run it on Oculus Quest 3 and 2 (Android). I can play videotestsrc smoothly in the pipeline. However, I can't play rtspsrc. What could be the reason for this? What should I do? I'm leaving the pipeline I'm using below, maybe you can provide some insight:

string pipeline = "rtspsrc location=rtsp://ipaddress:554 ! rtph264depay ! video/x-h264, stream-format=byte-stream ! h264parse ! video/x-h264, stream-format=byte-stream, alignment=au ! avdec_h264 ! videoconvert";
m_Texture.SetPipeline(pipeline + $" ! video/x-raw, format=BGRA, width={width}, height={height * 2} ! appsink name=videoSink");

Additionally, I can run this pipeline via cmd and receive images.
