ouvrt's Introduction

ouvrt

  1. About
  2. Setup and build
  3. ouvrtd
  4. Tools
  5. Todo

1. About

ouvrt is a playground to understand how the positional tracking systems used by the PlayStation VR, Oculus Rift (DK2, CV1), and HTC Vive virtual reality headsets work. The main component is the ouvrtd daemon that detects and opens relevant USB devices and sets them up for tracking.

Currently the following devices can be detected:

  • Lenovo Explorer headset (USB)
  • PlayStation VR headset (USB, via PSVR processing box)
  • Rift CV1 headset (USB)
  • Rift DK2 headset (USB)
  • Rift DK2 Positional Tracker (USB)
  • Rift remote (wireless via Rift CV1 headset)
  • Rift touch controller (wireless via Rift CV1 headset)
  • Vive headset (USB)
  • Vive base station (optical via Vive headset or controller)
  • Vive controller (USB or wireless via Vive headset)

Features are still limited to enabling the tracking LEDs for PlayStation VR, Rift DK2 and CV1, setting up the camera sensor for synchronized exposure (DK2 only), and capturing video frames into a GStreamer pipeline for debugging (DK2). IMU sensor data is captured and can be sent along with axis and button state via UDP.
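
The UDP packet layout and destination are not described in this README; purely as an illustration, a small receiver along the following lines (the port number is a made-up placeholder) can be used to check that telemetry packets arrive:

# Minimal sketch of a UDP telemetry receiver. The port is an assumption and
# must match whatever ouvrtd is configured to send to; the payload is only
# reported as a raw byte count because the packet layout is not documented here.
import socket

PORT = 1234  # placeholder, not taken from the project

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))
while True:
    data, addr = sock.recvfrom(2048)
    print(f"{addr[0]}: {len(data)} byte packet")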

2. Setup and build

The following prerequisite libraries and development packages are necessary to build ouvrt:

  • GLib/GObject/GIO
  • GStreamer (optional)
  • JSON-GLib
  • OpenCV (optional)
  • libudev
  • Linux kernel headers (hidraw, uvc, v4l2)
  • Meson
  • zlib
  • PipeWire (optional)

On a Debian stretch system these can be installed with the following commands:

$ apt-get install build-essential libglib2.0-dev libjson-glib-dev \
  libudev-dev meson pkg-config

And optionally:

$ apt-get install libgstreamer1.0-dev
$ apt-get install libopencv-dev
$ apt-get install libpipewire-0.2-dev libspa-lib-0.1-dev

To configure the build system and build everything, follow the standard Meson build procedure:

$ meson builddir
$ cd builddir
$ ninja

To build without an optional dependency, disable the corresponding option before calling ninja, for example:

$ cd builddir
$ meson configure -D gstreamer=false -D opencv=false -D pipewire=false
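
Running meson configure with no arguments from the build directory lists all options and their current values, which is an easy way to check which optional features are enabled:

$ meson configure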

3. ouvrtd

Make sure you have permissions to access the /dev/hidraw and /dev/video devices corresponding to the Rift DK2 and the DK2 Positional Tracker. Then run ouvrtd:

$ ./ouvrtd
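
If ouvrtd cannot open the devices because of permissions, a udev rule along the following lines can grant access to users at the active seat. The file name and the vendor ID used here (2833, the Oculus USB vendor ID) are assumptions for illustration, not rules shipped with this project:

# /etc/udev/rules.d/70-ouvrt-example.rules (illustrative only)
SUBSYSTEM=="hidraw", ATTRS{idVendor}=="2833", TAG+="uaccess", MODE="0660"
SUBSYSTEM=="video4linux", ATTRS{idVendor}=="2833", TAG+="uaccess", MODE="0660"

After adding such a rule, re-plug the devices or reload the rules with udevadm control --reload-rules followed by udevadm trigger.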

If compiled with PipeWire support, the daemon will create a PipeWire stream for each camera. An example camera observer Python script using the PipeWire GStreamer plugin to show all cameras is included in the scripts directory:

$ apt-get install gstreamer1.0-pipewire
$ scripts/ouvrt-cameras.py
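
As a rough sketch of what such an observer can do (the real scripts/ouvrt-cameras.py may differ), the pipewiresrc element from gstreamer1.0-pipewire can feed a camera stream into a display sink:

#!/usr/bin/env python3
# Minimal PipeWire camera viewer sketch; selecting a specific ouvrt camera
# stream is omitted here and would require pointing the source at the right
# PipeWire node.
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)
pipeline = Gst.parse_launch('pipewiresrc ! videoconvert ! autovideosink')
pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()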

If compiled with GStreamer support, the daemon will create a shared memory socket /tmp/ouvrtd-gst-0 and, if a DK2 Positional Tracker is connected, write frames into it as soon as a GStreamer shmsrc connects to it. To see the captured frames, run:

$ gst-launch-1.0 shmsrc socket-path=/tmp/ouvrtd-gst-0 is-live=true ! \
  video/x-raw,format=GRAY8,width=752,height=480,framerate=60/1 ! \
  videoconvert ! autovideosink

4. Tools

The dump-eeprom tool reads the EEPROM of the DK2 Positional Tracker and writes it to a file or to stdout:

$ ./dump-eeprom camera-dk2-rom.bin

$ ./dump-eeprom - | hexdump -C

5. Todo

  • Add blob detection and tracking
  • Enable Rift DK2 IR LED blinking patterns
  • Add individual blinking LED detection to the blob tracker
  • Add a 3D model of the tracking LEDs, read out from the Rift
  • Add support for camera intrinsic and lens distortion parameters, read out from the Rift and/or camera EEPROM
  • Add a PnP solver to estimate the pose from 3D-2D point correspondences (a rough sketch follows this list)
  • Add RANSAC PnP solver support for the initial pose estimation
  • Feed the projection of the estimated pose back as starting points for the PnP solver and blob tracker
  • Add Rift DK2 IMU support to estimate the pose from integrated gyro and acceleration sensor readouts
  • Add sensor fusion that regularly corrects the IMU pose with the camera pose, and use the fused pose estimate to feed back into the PnP solver and blob tracker
  • Implement proper time handling for all of this
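
For illustration only (not code from this project), the planned PnP/RANSAC step could look roughly like the following, using OpenCV's solvePnPRansac on matched 3D LED model points and 2D blob centers:

# Rough sketch of pose estimation from 3D-2D point correspondences (Python/OpenCV).
# led_model: Nx3 LED positions in the headset frame (as read out from the device)
# blobs:     Nx2 detected blob centers in the image, matched 1:1 to led_model
import numpy as np
import cv2

def estimate_pose(led_model, blobs, camera_matrix, dist_coeffs):
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(led_model, dtype=np.float32),
        np.asarray(blobs, dtype=np.float32),
        camera_matrix, dist_coeffs)
    if not ok:
        return None
    # rvec is a Rodrigues rotation vector and tvec the translation of the
    # headset frame in camera coordinates; inliers lists the correspondences
    # that survived RANSAC.
    return cv2.Rodrigues(rvec)[0], tvec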

ouvrt's People

Contributors

ph5


ouvrt's Issues

snapshot mode problem

Hi friend,
I ran into a problem while debugging MT9V034 snapshot mode in another project. Since your code also uses snapshot mode, I'd like to ask whether you have seen the same problem.

First I initialize the sensor, set it to snapshot mode (register 0x07 set to 0x0398), and turn off the AEC/AGC function (register 0xAF set to 0x0000). Then I apply a trigger pulse to the exposure pin and read out an image.

The problem is:
When the trigger cycle is shorter than about 0.5 s, the read-out image looks mostly normal.
As the trigger interval grows, the image shows more and more white noise (see attachment).
When the trigger interval reaches a few minutes, the image is basically all white.

In addition, observation and experiment showed:
1. The locations of the noise are completely fixed.
2. Putting the sensor into standby during the idle period does not solve the problem.
3. It looks as if the sensor keeps exposing during the idle time, but observing the LED_OUT pin shows that the exposure time is correct.
4. I tried initialization code similar to your int mt9v034_sensor_enable_sync(int fd) function and got the same result.

So I am submitting this issue to ask for your help.

Looking forward to your reply, thanks!

[Attached images showing the read-out frames at trigger intervals of 0.5 s, 1 s, 2 s, 5 s, 10 s, 30 s, 60 s and 600 s]
Related video: 20170825154727.zip

Samsung Odyssey plus gstreamer weird image, pipewire working fine

Tried on a Samsung Odyssey+. Built with GStreamer enabled and PipeWire disabled.

udev: Found HoloLens Sensors: /dev/bus/usb/008/003
HoloLens Sensors: Serial e146fa9d-fd47-4b81-97cd-11745667db51
HoloLens Sensors: USB3
HoloLens Sensors: acquired new id 0 for serial e146fa9d-fd47-4b81-97cd-11745667db51
ouvrtd: Acquired session bus, name: 'de.phfuenf.ouvrt.Ouvrtd'
ouvrtd: Acquired name "de.phfuenf.ouvrt.Ouvrtd"
HoloLens Sensors: Missing frame: 0 -> 25
debug: connected
debug: disconnected

[Screenshot: Odysseyplus]

I recompiled with GStreamer disabled and PipeWire enabled, and it works.

udev: Found HoloLens Sensors: /dev/bus/usb/008/003
HoloLens Sensors: Serial e146fa9d-fd47-4b81-97cd-11745667db51
HoloLens Sensors: USB3
HoloLens Sensors: acquired new id 0 for serial e146fa9d-fd47-4b81-97cd-11745667db51
ouvrtd: Acquired session bus, name: 'de.phfuenf.ouvrt.Ouvrtd'
ouvrtd: Acquired name "de.phfuenf.ouvrt.Ouvrtd"
HoloLens Sensors: Missing frame: 0 -> 25

[Screenshot: odyssey-gstreamer]

There always seem to be missing frames 0 -> 25.

Edit:
The first stream is 30 fps and works fine for grayscale video.
The second stream is 60 fps and is meant for tracking the controllers, as it only shows the bright LEDs.
It seems like the second stream could track PS Move controllers!

Thanks for your work!
But I still can't get the O+ to display with OpenHMD.
