handposing's People

Contributors

github-bot, MephestoKhaan, TheStoneFox

handposing's Issues

Consider finger snapshot autoadjust

Change the collision system so that, when generating a snap pose, the fingers automatically rotate back slightly if they are intersecting a surface.

This would save a lot of adjustment time when the tracking is not very stable.

Hand mirroring

Similar to #8, but converting between the left and right hand. Is there a way to automate ambidextrous poses? That way we could take a snapshot with the right hand and also support the left.
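The conversion could be sketched as a reflection of each recorded pose across the hand's mirror plane. This is a minimal sketch assuming poses are stored as Unity `Pose` values in local space; `PoseMirroring` and `MirrorPose` are hypothetical names, not the library's API.

```csharp
using UnityEngine;

public static class PoseMirroring
{
    // Mirror a local-space pose across the YZ plane (negate X), so a
    // right-hand snapshot can drive the left hand.
    public static Pose MirrorPose(Pose original)
    {
        Vector3 mirroredPosition = new Vector3(
            -original.position.x, original.position.y, original.position.z);

        // Reflecting a rotation across the YZ plane negates the Y and Z
        // quaternion components while keeping X and W.
        Quaternion mirroredRotation = new Quaternion(
            original.rotation.x, -original.rotation.y,
            -original.rotation.z, original.rotation.w);

        return new Pose(mirroredPosition, mirroredRotation);
    }
}
```

Each bone of the snapshot would be run through this before being retargeted onto the opposite hand.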

Consider IK

Right now we are using the raw angles from the hand tracking, but there is an argument in favour of using IK so we can generate poses where the fingertips align between the model and the tracked data.
Most probably a more complex task than the others.

Double hand grab

Some objects should be grabbed with two hands. At the moment the grab just swaps from one hand to the other.

Try to automate as much as possible the puppeting

When defining the bone rotations and assignments from a tracked hand to a "rigged hand", try to find a way to automate the whole process. Posing your ghost-tracked hand over the controllable hand and matching the closest transforms would be a great start.

Decouple OVR

OVR is used at the moment for its OVRSkeleton capabilities. Find a way to decouple it so that other hand tracking solutions can be added, such as Leap Motion or HoloLens.
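One way to do this decoupling could be a small provider interface that the posing code consumes instead of OVRSkeleton directly, with each backend adapting its own data. `ISkeletonDataProvider` and its members are assumptions for illustration, not existing API.

```csharp
using UnityEngine;

// Abstraction over a hand-tracking backend. An OVR implementation
// would wrap OVRSkeleton; Leap Motion or HoloLens backends would
// translate their own frame data into the same canonical bone order.
public interface ISkeletonDataProvider
{
    bool IsTracking { get; }

    // Bone poses in hand-local space, in a fixed canonical order.
    Pose[] BonePoses { get; }
}
```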

Free up some bones

Some poses could free up some bones, e.g. leaving the pinky or thumb still tracked so it can be used for something else (like moving it while grabbing a joystick).
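A per-finger mask could express which digits stay live-tracked while the grab pose drives the rest of the hand; this flags enum is a hypothetical sketch, not existing API.

```csharp
// Fingers that keep following live tracking data while the snap pose
// controls the remaining bones.
[System.Flags]
public enum TrackedFingers
{
    None   = 0,
    Thumb  = 1 << 0,
    Index  = 1 << 1,
    Middle = 1 << 2,
    Ring   = 1 << 3,
    Pinky  = 1 << 4,
    All    = Thumb | Index | Middle | Ring | Pinky
}
```

A snap pose could then carry e.g. `TrackedFingers.Thumb | TrackedFingers.Pinky` to leave those digits free while gripping a joystick.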

Unity 2020.3 Integration

Hi there, I had to update my project to Unity 2020 as a requirement for an asset I am using.

Some of the required packages seem to have been removed from the 2020 package manager:

  * Oculus Android (2.38.6)
  * Oculus Desktop (2.38.4)
  * OpenVR Desktop (2.0.5)
  * XR Legacy Input Helpers (1.3.11)
  
Has this repo been calibrated to work in URP for Unity 2020.3.2f1?

Generate sample

Create a scene that illustrates all possible configurations

Can't find CanMove option

Hi again. After reading your wiki regarding snapping behaviours, I was trying to find the CanMove option you talk about, so I can leave the Grabbable dial in the same place when I grab it. I can't find it anywhere. Any tips?
Kind regards
J.

Full support with XR integration

At the moment this works directly with the Oculus Integration from the Asset Store / Oculus website.
Investigate the changes needed to fully support the more general Unity XR integration.

Support other grabbers

Currently the grabber is based on OVRGrabber (a simplified version).
Consider a design that makes it easy to attach this to the original OVRGrabber without modifications, to the VRTK grabber, and to others (Unity XR?).

Setup wizard

It would be beneficial to have a "setup wizard" where you just point at the relevant GameObjects, give a few parameters, and it creates all the necessary components and links them.

Use of HandPosing without Oculus Link

Hi! I'm trying to record a hand pose for grabbing a vault dial. I only have the Quest linked to the laptop to build and run the APK. Is there a way to record the pose without Oculus Link? Also, how would I keep the dial fixed and only rotating along the Z axis?
Big thanks

Turning off "Can Snap Back" breaks grabbing in build mode

The grabbed object will simply ignore the grabber. You can recreate it by building the example scene provided in the repo: the cylinder and the physics-based lever will not work properly. It works perfectly in play mode, just not in a build. I'm using Unity 2019.4 and didn't change anything in the repo (apart from adding the Oculus Integration asset).

Generate samples

Samples illustrate what can be accomplished with the library: interactions with all supported volume types, objects with joints, sliding hands, hand mirroring, hand scaling, avatar compatibility, and custom grabbers.

Private _grabPoints

@MephestoKhaan Good day, sir. Could you please make the field _grabPoints in the Grabbable class protected instead of private? We are using HandPosing to improve the grab system in our app. We have procedural meshes, and for those meshes we need to update the grab points. The base system implemented in the samples is currently sufficient for our purposes, and creating a full Grabbable/Grabber implementation would take a long time.
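The requested change amounts to a one-word accessibility tweak; this sketch assumes `_grabPoints` holds colliders (its actual type may differ), and the subclass shown is hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;

public class Grabbable : MonoBehaviour
{
    // was: private — protected lets subclasses regenerate the points.
    protected List<Collider> _grabPoints = new List<Collider>();
}

// Example subclass for procedural meshes: rebuild the grab points
// whenever the mesh (and its colliders) change.
public class ProceduralGrabbable : Grabbable
{
    public void RefreshGrabPoints(IEnumerable<Collider> colliders)
    {
        _grabPoints.Clear();
        _grabPoints.AddRange(colliders);
    }
}
```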

Minor suggestions

Here are a few minor suggestions that you might find useful:

  • OnGrabAttemp -> OnGrabAttempt (missing final "t"). I've noticed a similar issue in BaseGrabber.
  • Are the { get; set; } accessors on the Actions in IGrabNotifier necessary?

Hand tracking mostly doesn't work for fingers

Hi, I'm having an issue using hand tracking in Unity through Oculus Link to set up poses.

Everything works perfectly when using controllers, but so far I've managed to get hand tracking working on only 2 tests out of many.

Generally, my hands remain in the "default pose" even though they actually follow my movements; the bones don't move at all, and the hand is in a weird rotation. I can still grab objects and the hand will take the right pose, but as soon as I let go, they go back to their default position.
Also, when it is working I see the skeleton on my hand, but it is not rendered when it is not working.

I managed to record poses the 2 times it was working, though; I just want it to work every time.
It also works fine in Android builds.

I'm using Unity 2019.3.15f1, the latest version of the Oculus Integration, XR Interaction Toolkit 0.9.4, and Oculus XR Plugin 1.3.4.

Grabber/Grabbable interface

Instead of adding our own grabber, create an interface so it is easier for anyone to build their own system (or plug it into the current one).
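A minimal sketch of such an interface pair, with hypothetical names (`IGrabber`, `IGrabbable`, and their members are assumptions, not the library's API):

```csharp
using UnityEngine;

public interface IGrabbable
{
    void GrabBegin(GameObject hand, Collider grabPoint);
    void GrabEnd(GameObject hand, Vector3 linearVelocity, Vector3 angularVelocity);
}

public interface IGrabber
{
    // Events let listeners (e.g. the snapping system) react without
    // depending on any concrete grabber implementation.
    event System.Action<IGrabbable> OnGrabStarted;
    event System.Action<IGrabbable> OnGrabEnded;
}
```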

Hands Sliding

Hands should be able to freely move around the grip volume if they are being held somewhere else.

Grip recalculation

The grip position is recalculated every time we swap hands; it should be needed just once and should not really depend on the swap.

GrabFlexThreshold needed?

Just wondering if GrabFlexThreshold needs to be in IGrabNotifier? I understand that it is needed for sliding, but in my opinion the thresholds for starting and stopping a grab should be kept isolated. Maybe there is a way to make this a normalized value that is provided... i.e. 0 means not grabbing, 1.0 means grabbing at full strength.
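The normalized value suggested above could be derived from the raw flex with a simple remap; the threshold constants here are illustrative assumptions, not values from the library.

```csharp
using UnityEngine;

public static class GrabStrength
{
    // Hypothetical thresholds: flex above Start counts as a full grab,
    // below Stop as no grab at all.
    const float StopThreshold  = 0.35f;
    const float StartThreshold = 0.85f;

    // Remap raw flex to 0..1: 0 = not grabbing, 1 = full strength.
    public static float Normalized(float rawFlex)
    {
        return Mathf.Clamp01(
            (rawFlex - StopThreshold) / (StartThreshold - StopThreshold));
    }
}
```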

Generate hand animations

Given a rigged hand without animations, we could use the hand tracking to generate animation states (pointing, an "OK" sign, horns, etc.) to feed an Animator.

Fixed objects

Some objects might be fixed in space (for example a door handle, the bars of a ladder, etc.). Add support to the Grabber so such objects do not parent to the hand, and also allow the hand to slide in position.

Generate documentation

Document all aspects of the library, including pose generation, sliding hands, using one's own grabber, joint interactions, hand mirroring, hand scaling, and avatar compatibility.

Pose record for scaled objects

When I try to record a pose for a scaled object, the ghost hand becomes scaled too. How can I record a pose for a scaled object without this issue?

Hand Scaling

Support scaling the hands.

Maybe for now we can record the poses at different scales and then lerp between them.
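Per bone, the lerp idea could look like this, assuming a pose is stored as local bone rotations recorded at two known hand scales; all names here are illustrative.

```csharp
using UnityEngine;

public static class ScaledPoseBlend
{
    // Blend two recorded bone rotations for an intermediate hand scale.
    // scaleA/scaleB are the hand scales the two poses were recorded at.
    public static Quaternion BlendBone(
        Quaternion poseAtScaleA, Quaternion poseAtScaleB,
        float scaleA, float scaleB, float currentScale)
    {
        float t = Mathf.InverseLerp(scaleA, scaleB, currentScale);
        return Quaternion.Slerp(poseAtScaleA, poseAtScaleB, t);
    }
}
```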

Ghost joints editor

Currently we have some mini-rotators to adjust the offsets of the bones; it would be more usable to tweak the fingers manually on the ghost.

Surface transform

Currently the CylinderSurface depends on the Grip of the Ghost to calculate its global position.
Should we use the Grabbable instead?

Consider store poses as animations

Poses could maybe be saved as animations. Typically one frame long, but we could enable several frames so we could attach them to "use" gestures, like holding a gun and pressing the trigger.

Undo hand offset

When using controllers, the hand moves and rotates to the best pose and stays there. If said pose is similar enough to the original hand pose it works very well, but in some cases we might want to, once grabbed, quickly animate the hand back to the original position while grabbing the object.

This already works like that, but without the animation, when using hand-tracked hands, since it apparently overrides the position.

Joints support

Ensure the physics systems still work with snapping (like levers)

Last commit broke hand tracking

Firstly, it fixed all the controller issues, so thanks for that! But hand tracking no longer tracks in play mode, although it snaps properly. In a build it tracks the hand but no longer snaps.

Hands inversion

Inverting the hand direction (up to down) currently works well only along a cylinder, not necessarily when using it for grabbing an edge, since the grip point is not necessarily centered. Is there a way to speed up this "inversion" process?

Maybe instead of solving it fully, we could draw a secondary ghost and indicate an offset in the data.

Support other volumes

Support spheres and maybe toruses.

The general solution for this is really to loosen the scoring system a bit, which right now checks the entire quaternion, and the "limit" of the cylinder volume.

Decouple Grabbable from BaseGrabber

Grabbable refers to BaseGrabber in its implementation, which can cause issues if other people want to use their own IGrabNotifiers. It might be a good idea for certain items like GrabBegin to use IGrabNotifier instead, however child classes like PhysicsGrabbable call GetComponent on the hand argument passed in, so using GameObject as the argument for GrabBegin can be a good compromise.

Now one issue with trying to decouple Grabbable from BaseGrabber is that the former does call the static function BaseGrabber.ClearAllGrabs in UnsubscriberGrabber. You might not be able to cleanly replace that specific code with an interface...

Hand Physics Issues

Unsure how to configure hand physics properly to avoid fingers moving through objects prior to grabbing.

Have pure "object to hand" snap mode

This one will take a bit of work to figure out. There are two types of grabbing under consideration here:

  1. Object-to-hand: The hand is stationary (i.e. does not reorient) and the object orients to the hand upon grabbing. The closest pose is snapped to, but the hand does not rotate itself to do so because the object moves toward the hand.
  2. Hand-to-object: Object does not reorient itself; the hand puppet simply snaps to a selected pose.

If "snap back" is false on a snap point, then hand-to-object is easily satisfied. This is seen with the pose gallery's ring, for instance.

However, if snap back is true, then in the current implementation it appears that both the hand and the object move toward each other. This can be seen with the sword in the released sample. I think "snap back" should be an enum that allows the two behaviors above, along with what is currently implemented (a "both object-to-hand and hand-to-object" mode). But if that requires changing too many assets, maybe there should be another boolean that allows pure "object-to-hand" behavior.
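The proposed enum could be as simple as the following; the names are illustrative, not existing API.

```csharp
// Replaces the "snap back" boolean with three explicit behaviours.
public enum SnapMode
{
    HandToObject,  // object stays put; the hand puppet snaps to the pose
    ObjectToHand,  // hand stays put; the object reorients to it
    Both           // current behaviour: both move toward each other
}
```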

I had to work on other stuff today but tomorrow I can see how this should be implemented. Please let me know if you have any helpful pointers.

Store the raw poses

Currently we are storing the bone rotations as seen in the local puppet. But they might differ from the actual tracking data since they need to be transformed (depending on the model).

Store the raw skeleton data (or infer it from the controller if not using hand-tracking)
