
dkvfx's Introduction

Dkvfx

Screenshot

This is a Unity sample project that shows how to integrate a volumetric video recorded with Depthkit into a Visual Effect Graph.

This project requires Unity 2019.3.

How To Install

This package uses the scoped registry feature to resolve package dependencies. Please add the following sections to the manifest file (Packages/manifest.json).

To the scopedRegistries section:

{
  "name": "Keijiro",
  "url": "https://registry.npmjs.com",
  "scopes": [ "jp.keijiro" ]
}

To the dependencies section:

"jp.keijiro.dkvfx": "0.1.2"

After the changes, the manifest file should look like the one below:

{
  "scopedRegistries": [
    {
      "name": "Keijiro",
      "url": "https://registry.npmjs.com",
      "scopes": [ "jp.keijiro" ]
    }
  ],
  "dependencies": {
    "jp.keijiro.dkvfx": "0.1.2",
...


dkvfx's Issues

Multiple instances problem?

When I use multiple instances of the player/converter/vfx prefabs, each one updates with the new video. How can I use this with multiple instances and several different videos in the same scene?
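
A possible way to decouple instances (a minimal sketch; the exposed-texture names "ColorMap"/"PositionMap" and the converter wiring are assumptions, not the package's confirmed API) is to give every converter/VFX pair its own pair of render textures at startup:

using UnityEngine;
using UnityEngine.VFX;

// Hypothetical helper: allocates a private color/position map pair for one
// converter + VFX pair so several Depthkit clips can coexist in a scene.
public class PerInstanceMaps : MonoBehaviour
{
    [SerializeField] VisualEffect _vfx = null;
    [SerializeField] int _width = 512, _height = 424;

    RenderTexture _color, _position;

    void Start()
    {
        _color    = new RenderTexture(_width, _height, 0, RenderTextureFormat.ARGBHalf);
        _position = new RenderTexture(_width, _height, 0, RenderTextureFormat.ARGBHalf);

        // "ColorMap"/"PositionMap" are assumed exposed-texture names in the graph.
        _vfx.SetTexture("ColorMap", _color);
        _vfx.SetTexture("PositionMap", _position);

        // The converter on this instance would also need to write into
        // _color/_position instead of the shared render texture assets.
    }

    void OnDestroy()
    {
        if (_color != null) _color.Release();
        if (_position != null) _position.Release();
    }
}

With one such pair per instance, each player/converter/VFX trio reads and writes only its own maps instead of the shared assets.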

Multiple Depthkit videos in one scene

Hi, I saw there is a closed case about this, but I'm also having problems using multiple videos in one scene - e.g. 2 videos playing at the same time with different graphs.

I have set everything up from scratch, but all the HAP players in my scene are referring to the same converter, and there is only one set of color and position maps to use in the VFX Graph.

When you have multiple streams of Depthkit recordings playing at the same time, I'm guessing each clip should have its own color and position maps? I can't think of another way to have multiple clips playing at the same time in different graphs.

Thanks,
Charlie

different videos/assets in one scene

Dear Keijiro,

First of all, thank you for your dedication and creativity in creating this effect. I have a question:

Is it possible to stream different videos / assets in one scene? I tried to duplicate the VFX effect, the HAP player and the converter and connect them to each other, but it seems that the HAP player is always referring to the same converter, so all players in the scene are streaming the same video.

I would love to hear from you. I could not find any useful advice by googling. Thanks in advance!

Vincent

Removal of HAP dependency for mobile support (i.e. Quest)

It'd be great if this repo had an example that is free from the HAP dependency and can be driven by the Unity Video Player or AVPro instead.

This would unlock mobile XR devices like Quest, Magic Leap, and other mobile XR platforms.
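
A rough sketch of how the built-in Unity Video Player could serve as a frame source, assuming the converter can be pointed at a RenderTexture (that wiring is not something this repo confirms):

using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch: drive a RenderTexture from Unity's built-in VideoPlayer
// instead of the HAP player, so mobile platforms without HAP can be targeted.
public class VideoPlayerSource : MonoBehaviour
{
    [SerializeField] VideoClip _clip = null;
    [SerializeField] RenderTexture _target = null; // would be read by the converter

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = _clip;
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = _target;
        player.isLooping = true;
        player.Play();
    }
}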

Metal: Error creating pipeline state (Hidden/Dkvfx/Convert): Link failed

Hi,

First of all thank you so much for all your creativity!

Now, I've been trying to get it to run for a while, but I get this error and a black screen with nothing rendering:
"Metal: Error creating pipeline state (Hidden/Dkvfx/Convert): Link failed: The type of fragment input user(TEXCOORD0) does not match the type of the corresponding vertex shader output
(null)"

I'm on OSX 10.14.4, Unity 2018.3.6, on a 2012 15" MacBook Pro.
I also tried it on OSX 10.11 and Unity 2019.1b, and updated Xcode - nothing changes.
I've successfully run your DepthkitTest and Unity HDRP test project on the same machine.

Thank you!

Hap conversion parameters

Hi Keijiro,

May I ask what tool/parameters you are using to convert Depthkit exports to HAP?
I'm using ffmpeg on the MP4 that is exported from Depthkit, like this:

ffmpeg -i input.mp4 -c:v hap output.mov

but somehow when I use this video (with the proper metadata), there is a strange warp in the point cloud in Unity (as if the metadata and the depth data encoding do not match). I realized that if I export the video using Adobe Media Encoder this doesn't happen, so I wonder if there are some color encoding parameters that I'm missing...

Wrong depth decoding

Hi Keijiro,

I'm opening another issue related to artifacts in the depth data. I have been getting wrong depth decoding, which I first thought was due to the HAP conversion, then to depth clipping. Now I've tracked down the problem to the RGB2Hue function in Depthkit's shader code, and I solved it, though it still puzzles me why this does not happen on your captures.

Your capture:

My capture from Azure (has some strange depth segments):

If I replace the RGB2Hue function in Depthkit's Common.hlsl (I'm using these), the problem gets solved:

Posting this here in case someone runs into the same issues.
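
For reference, a widely used epsilon-guarded RGB-to-hue formulation, shown here as a C# sketch rather than the actual Depthkit shader code, looks like this:

using UnityEngine;

static class HueUtil
{
    // RGB -> hue in [0, 1], guarded against division by zero for near-grey
    // pixels (a common source of hue-decode artifacts).
    public static float RgbToHue(Color c)
    {
        var p = c.g < c.b ? new Vector4(c.b, c.g, -1f, 2f / 3f)
                          : new Vector4(c.g, c.b, 0f, -1f / 3f);
        var q = c.r < p.x ? new Vector4(p.x, p.y, p.w, c.r)
                          : new Vector4(c.r, p.y, p.z, p.x);
        var chroma = q.x - Mathf.Min(q.w, q.y);
        return Mathf.Abs(q.z + (q.w - q.y) / (6f * chroma + 1e-10f));
    }
}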

Multiple cameras

How can I add another camera to your project with the same computer?

Great job!

Jose

Using the DKvfx on my own projects

Hi Keijiro,
First of all, thank you for the creative stuff. I'm really enjoying it.
I would really like to use Dkvfx with my own Depthkit footage in my own projects, but I'm having some difficulties.
I've imported the HAP player package and also your dkvfx one into my project, but the video isn't playing the right way: it plays as a flat video and not as volumetric footage.
I see the Depthkit DLL in the project, but I'm not sure how to connect it to the footage I've uploaded.
When I'm using your project everything works great, but when I tried to open my own it didn't work properly.
Solving this will help me a lot and I would really appreciate it.

Thanks once again,

Itamar

Errors appear after installing Cinemachine from the Package Manager inside the Dkvfx project.

Hi,

I get these 3 similar errors after installing Cinemachine from the Package Manager:

C:\Program Files\Unity\2019.3.0f1\Editor\Data\Resources\PackageManager\BuiltInPackages\com.unity.ugui\Runtime\UI\Core\InputField.cs(1628,40): error CS0246: The type or namespace name 'Event' could not be found (are you missing a using directive or an assembly reference?)

C:\Program Files\Unity\2019.3.0f1\Editor\Data\Resources\PackageManager\BuiltInPackages\com.unity.ugui\Runtime\UI\Core\InputField.cs(1804,34): error CS0246: The type or namespace name 'Event' could not be found (are you missing a using directive or an assembly reference?)

C:\Program Files\Unity\2019.3.0f1\Editor\Data\Resources\PackageManager\BuiltInPackages\com.unity.ugui\Runtime\UI\Core\InputField.cs(1798,17): error CS0246: The type or namespace name 'Event' could not be found (are you missing a using directive or an assembly reference?)

Does anyone have an idea of what could cause that, and is there any possible fix?

Thanks in advance !

Mathieu.

Metadata asset

Hi Keijiro,

I'm using your repo to make some effects on Depthkit footage from Azure. I'm guessing your Metadata.asset file is from Kinect V2, as your depth image resolution is 512 x 424. I'm trying to copy the parameters from Azure into a new Metadata file.
Depthkit conversion gives a metadata file from which I can copy:

  • depthFocalLength
  • depthImageSize
  • depthPrincipalPoint
  • farClip
  • nearClip
  • format (not needed)
  • clipEpsilon

For comparison, here are lines 18 to 24 of the Metadata scriptable object:
        material.SetVector("_Crop", pers.crop);
        material.SetVector("_ImageDimensions", pers.depthImageSize);
        material.SetVector("_FocalLength", pers.depthFocalLength);
        material.SetVector("_PrincipalPoint", pers.depthPrincipalPoint);
        material.SetFloat("_NearClip", pers.nearClip);
        material.SetFloat("_FarClip", pers.farClip);
        material.SetMatrix("_Extrinsics", pers.extrinsics);

Therefore, I'm missing extrinsics and crop. How would I get them?
(I tried using Depthkit's sensor calibration JSON file, but their extrinsic values are not in matrix form.)

Best,

Sergio
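
If a calibration file only provides a per-sensor translation and rotation rather than a full 4x4 matrix, one way to compose an extrinsics matrix in Unity is sketched below; the input layout and any axis or handedness conventions are assumptions that would need to be checked against the actual calibration data:

using UnityEngine;

static class ExtrinsicsUtil
{
    // Composes a 4x4 extrinsics matrix from a translation and a rotation given
    // as Euler angles in degrees. Depending on the source's coordinate
    // convention, an axis flip or a transpose may still be required.
    public static Matrix4x4 FromTranslationRotation(Vector3 translation, Vector3 eulerDegrees)
    {
        return Matrix4x4.TRS(translation, Quaternion.Euler(eulerDegrees), Vector3.one);
    }
}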

Is it possible to crop particles ?

Hi again,

I was wondering if it is possible to crop a section out of my volumetric capture? In other words, I'm trying to get rid of the particles in front of my capture. I tried with the bounding box and adjusting its size, but it didn't seem to change anything.

Here's an image of what I'm trying to do:

image

Should it work in VR?

Dear Mr. Takahashi,

Inside my Unity project I can see the clip with the VFX perfectly fine in VR, but when I build the project and try it, I can't see the clip and I see only a wall with the color I picked inside the VFX Graph.
Any idea?

Best way to use Dkvfx with a modified version of KlakHap

Hi,

I want to introduce slight modifications to KlakHap in order to use them with Dkvfx. However, KlakHap is a git-based dependency of Dkvfx, so my changes will be overwritten. When I try to import KlakHap locally, apparently Dkvfx does not "see" KlakHap's namespace. What would be the best way of using a modified version of KlakHap with Dkvfx?

VFX depth strange offset

Hi,

I've made a video of a hand in Depthkit, but I'm finding that whatever I try, I have some weird issues with the depth of the VFX (maybe similar to the open issue "Wrong depth decoding").

The first screenshot shows the video (of a hand), and the second screenshot shows the VFX. There is always a smaller second VFX particle cluster which is offset.

I've created a new metadata asset and kept the Color & Position render textures the same except for the size, which I changed to match the video size of 752 x 1184.

Do you know what I could do?

Screenshot_DepthkitVideo

Screenshot_VFX

Multi-Sensor videos

Hi!
I was wondering if this setup could theoretically work with multi-sensor videos? Would I need to create 10 different instances of the Converter + Render Textures? Should I be able to use the same one?

I'm trying to test with a 10-camera setup coming out of DepthKit and I'm unsure what exactly is wrong, if it's the metadata or my setup.

Thanks!

Streaming Assets Folder not recognizing HAP files: "Failed to Open Stream"

Hi all, this is my first GitHub issue or forum post ever, so my apologies if this query doesn't follow the proper etiquette.

(My machine is an MSI GS63VR with a GeForce GTX 1060, and I am running Unity 2019.1.0f2.)

I've used ffmpeg to encode my Depthkit .mp4 file into the HAP codec, dragged that file into the StreamingAssets folder and changed the file name in the HAP Player to my file name. I've also converted the Depthkit metadata and dragged that into the Converter's metadata parameter.

The Hap Player returns a warning in the UI that states "failed to open file. Please specify a valid HAP-encoded .mov file". The console also returns an error that states "failed to open stream (). Unity.Engine.Debug:LogError(object)."

I thought this was probably because I wasn't using ffmpeg properly, but I have re-installed ffmpeg several times now and double-checked that it is set up to encode HAP, following this link.

Also, when I try to revert back to the Test video and test.meta, I receive the same alert and error and see the same sort of generic rectangular VFX particles, so maybe something else is going on?

Lastly, I've had some trouble using the Git support in the Package Manager. I've looked through the forum and installed mob-sakai's UPM Git Extension package, which seems to help.

(Four screenshots of the Unity 2019.1.0f2 editor attached.)
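
For "failed to open stream" errors, one quick thing to verify (a debugging sketch, not part of the package) is whether the resolved StreamingAssets path actually exists:

using System.IO;
using UnityEngine;

// Hypothetical sanity check: logs whether the file the HAP player is asked to
// open really exists under StreamingAssets with that exact name and extension.
public class StreamingAssetCheck : MonoBehaviour
{
    [SerializeField] string _fileName = "Test.mov"; // hypothetical file name

    void Start()
    {
        var path = Path.Combine(Application.streamingAssetsPath, _fileName);
        Debug.Log(File.Exists(path)
            ? "Found: " + path
            : "Missing: " + path + " (check spelling, extension and subfolders)");
    }
}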

Can Dkvfx also work with URP?

Hi Keijiro,

I'm working on a VR project combining volumetric videos.
I'm creating my project with the Universal Render Pipeline and really want to use the VFX Graph.
I've added the repositories and everything, and I'm not getting any errors, but it also doesn't work...

Thanks a lot in advance
