MephestoKhaan / HandPosing
Pose authoring using hand tracking on Quest
License: MIT License
Change the collision system so that, when generating a snap pose, fingers that are intersecting a surface automatically rotate back a bit.
This would save a lot of time on adjustments when the tracking is not very stable.
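The back-off idea above can be sketched as a simple loop: rotate the offending joint back in small steps until the overlap test passes or a limit is reached. This is a language-agnostic sketch (the project itself is Unity/C#); the step size, limit, and `intersects` callback are all made up for illustration.

```python
def relax_finger(angle, intersects, step=2.0, max_back=30.0):
    """Rotate a finger joint back in small steps until it no longer
    intersects the surface (or a back-off limit is reached).

    intersects(angle) stands in for whatever overlap test the engine
    provides; all names and values here are illustrative."""
    backed_off = 0.0
    while intersects(angle) and backed_off < max_back:
        angle -= step
        backed_off += step
    return angle
```

In practice this would run once per finger joint when the snapshot is taken, so the stored pose is already collision-free.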
Similar to #8, but swapping between the left and right hand. Is there a way to automate ambidextrous poses? That way we could take a snapshot with just the right hand and also support the left.
Right now we are using the raw angles from the hand tracking, but there is an argument in favour of using IK so we can generate poses where the fingertips align between the model and the tracked data.
Most probably a more complex task than the others.
Some objects should be grabbed with two hands; at the moment the grab just swaps from one hand to the other.
When defining the bone rotations and assignments from a tracked hand to a "rigged hand", try to find a way to automate the whole process. Posing your ghost tracked hand over the controllable hand and matching the closest transforms could be a great start.
Oculus XR Plug-in v1.10
"Oculus Utilities Plugin with OpenXR is being used, which is under experimental status"
This upgrade disables hand tracking and controller use of the trigger button. So sad!
OVR is used at the moment for its OVRSkeleton capabilities. Find a way to decouple it so other hand tracking solutions can be added, such as Leap Motion or HoloLens.
Some poses could free up some bones, e.g. leaving the pinky or thumb still tracked so it can be used for something else (like moving it while grabbing a joystick).
Hi there, I have had to update my project to Unity 2020 as a requirement for an asset I am using.
Some of the required packages seem to have been removed from the 2020 package manager:
* Oculus Android (2.38.6)
* Oculus Desktop (2.38.4)
* OpenVR Desktop (2.0.5)
* XR Legacy Input Helpers (1.3.11)
- has this repo been calibrated to work with URP in Unity 2020.3.2f1?
Create a scene that illustrates all possible configurations
Hi again. After reading your wiki on snapping behaviours, I was trying to find the CanMove option you talk about, so I can leave the Grabbable dial in the same place when I grab it. I can't find it anywhere. Any tips?
Kind regards
J.
At the moment this works directly with Oculus Integration from the Asset Store / Oculus website.
Investigate the changes needed to fully support the more general XR integration from Unity
Currently the grabber is based on OVRGrabber (a simplified version).
Consider a design that allows easily attaching this to the original OVRGrabber without modifications, the VRTK grabber and others (Unity XR?)
It would be beneficial to have a "setup wizard" where you just point at the relevant GameObjects, give a few parameters, and it creates all the necessary components and links them.
I'm wondering if that's expected behavior? If PosingIDsToFinger doesn't have the value in the dictionary, should it return false, or am I reading that wrong?
Hi! I'm trying to record a hand pose for grabbing a vault dial. I only have the Quest linked to the laptop to build and run the APK. Is there a way to record the pose without Oculus Link? Also, how would I keep the dial fixed, rotating only along the Z axis?
Big thanks
The grabbed object will simply ignore the grabber. You can reproduce it by building the example scene provided in the repo: the cylinder and the physics-based lever will not work properly. It works perfectly in play mode, just not in a build. I'm using Unity 2019.4 and didn't change anything in the repo (other than adding the Oculus Integration asset).
Samples illustrate what can be accomplished with the library: interactions with all supported volume types, objects with joints, sliding hands, hand mirroring, hand scaling, avatar compatibility, custom grabbers.
Bundle the project as a Unity package for easier delivery.
@MephestoKhaan Good day, sir. Could you please make the field _grabPoints in the Grabbable class protected instead of private? We are using HandPosing to improve the grab system in our app; we have procedural meshes, and for these meshes we need to update the grab points. The base system implemented in the samples is currently sufficient for our purposes, and creating a full Grabbable/Grabber implementation would take a long time.
Here are a few minor suggestions that you might find useful:
OnGrabAttemp -> OnGrabAttempt (missing final "t"). I've noticed a similar issue in BaseGrabber. Are the { get; set; } accessors on the Actions for IGrabNotifier necessary?

Hi, I'm having an issue using hand tracking in Unity through Oculus Link to set up poses.
Everything works perfectly when using controllers, but so far I've managed to get hand tracking working on only 2 tests out of many.
Generally, my hands remain in the "default pose" even though they actually follow my movements; the bones don't move at all, and the hand is in a weird rotation. I can still grab the objects and the hand will take the right pose, but as soon as I let go, they go back to their default position.
Also, when it is working I see the skeleton on my hand, but it is not rendered when it is not working.
I managed to record poses the 2 times it was working, though; I just want it to work every time.
It works fine in Android builds as well.
I'm using Unity 2019.3.15f1, the latest version of Oculus Integration, XR Interaction Toolkit 0.9.4 and Oculus XR Plugin 1.3.4.
Instead of adding our own grabber, create an interface so it is easier for anyone to build their own system (or plug into the current one).
Hands should be able to freely move around the grip volume if they are being held somewhere else.
Grip position is recalculated every time we swap hands; it should only be needed once and not really depend on the swap.
Just wondering if GrabFlexThreshold needs to be in IGrabNotifier? I understand that it is used for sliding, but in my opinion the thresholds for starting and stopping a grab should be kept isolated. Maybe there is a way to expose this as a normalized value instead, i.e. 0 means not grabbing, 1.0 means grabbing at full strength.
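The normalized-strength idea can be sketched as a simple remapping of the raw flex value onto 0..1 between two thresholds. A language-agnostic sketch (the project is Unity/C#); the threshold values are hypothetical, not taken from the HandPosing codebase.

```python
def grab_strength(flex, lower=0.35, upper=0.85):
    """Map a raw flex reading to a normalized 0..1 grab strength.

    Below `lower` the hand is considered open, above `upper` fully
    grabbing; in between the strength ramps linearly. The thresholds
    are illustrative defaults, not values from the library."""
    if flex <= lower:
        return 0.0
    if flex >= upper:
        return 1.0
    return (flex - lower) / (upper - lower)
```

Consumers would then only see the normalized value, keeping the concrete start/stop thresholds private to the grabber implementation.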
Can we convert from our poses to Oculus Poses?
Given a rigged hand without animations, we could use hand tracking to generate the animation states (pointing, doing "OK", horns, etc.) to feed an Animator.
Some objects might be fixed in space (for example a door handle, the bars of a ladder, etc). Add support to the Grabber so they do not parent to the hand, and also allow the hand to slide in position.
All aspects of the library are documented, including pose generation, sliding hands, using one's own grabber, joint interactions, hand mirroring, hand scaling, avatar compatibility.
Support scaling the hands.
Maybe for now we can record the poses with different scales and then lerp between them
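Lerping between two poses recorded at different hand scales amounts to interpolating each bone rotation; for rotations that means slerp rather than a plain lerp. A language-agnostic sketch (the project is Unity/C#, where `Quaternion.Slerp` would do this directly); the `{bone: quaternion}` pose layout is made up for illustration.

```python
import math

def slerp(q0, q1, t):
    """Spherical interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:            # take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:         # nearly parallel: linear lerp + renormalize
        q = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in q))
        return tuple(c / n for c in q)
    theta = math.acos(dot)
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def blend_poses(pose_a, pose_b, t):
    """Blend two recorded poses (captured at different hand scales).

    A pose here is just {bone_name: quaternion}; this mirrors the
    idea, not the actual HandPosing data types."""
    return {bone: slerp(pose_a[bone], pose_b[bone], t) for bone in pose_a}
```

Given poses recorded at, say, scale 0.8 and 1.2, a hand at scale 1.0 would use t = 0.5.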
Currently we have some mini-rotators to adjust the offsets of the bones; it would be better to manually tweak the fingers on the ghost.
Currently the CylinderSurface depends on the Grip of the Ghost to calculate its global position.
Should we use the Grabbable instead?
Poses could maybe be saved as animations. Typically 1 frame long, but we could enable several frames so we could attach them to "use" gestures, like holding a gun and pressing the trigger.
When using controllers, the hand moves and rotates to the best pose and stays there. If that pose is similar enough to the original hand pose this works very well, but in some cases we might want, once grabbed, to quickly animate the hand back to its original position while still grabbing the object.
This already works like that (though without the animation) when using hand-tracked hands, since the position is apparently overridden.
Ensure the physics systems still work with snapping (like levers)
Firstly, it fixed all the controller issues, so thanks for that! But hand tracking no longer tracks in play mode, although it snaps properly; in a build, it tracks the hand but no longer snaps.
The Pose class from Unity reads much better than a (Vector3, Quaternion) tuple.
Inverting the hand direction (up to down) now works well only along a cylinder, not necessarily when grabbing an edge, since the grip point is not necessarily centered. Is there a way to speed up this "inversion" process?
Maybe instead of solving it fully, we could draw a secondary ghost and indicate an offset in the data
Deliver on the Asset Store as well as a package, for discoverability.
Support spheres and maybe toruses.
The general solution for this is really to loosen the scoring system a bit, which right now checks the entire quaternion, as well as the "limit" of the cylinder volume.
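One way to loosen the scoring is to split the match into a position term and a rotation term and down-weight the rotation, so a near-miss orientation can still snap. A language-agnostic sketch (the project is Unity/C#); the weights and the scoring form are invented for illustration, not taken from the actual scorer.

```python
import math

def pose_score(pos_a, rot_a, pos_b, rot_b, w_pos=1.0, w_rot=0.5):
    """Score how well a tracked pose matches a recorded one, with the
    rotation term down-weighted. Higher is better; 0 is a perfect match.

    Rotations are unit quaternions (w, x, y, z)."""
    pos_err = math.dist(pos_a, pos_b)
    # |dot| is 1 for identical rotations and ~0 for orthogonal ones;
    # abs() treats q and -q as the same rotation.
    dot = abs(sum(a * b for a, b in zip(rot_a, rot_b)))
    rot_err = 1.0 - min(dot, 1.0)
    return -(w_pos * pos_err + w_rot * rot_err)
```

Tuning `w_rot` down (or clamping `rot_err`) is the knob that makes the full-quaternion check less strict without discarding it entirely.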
Grabbable refers to BaseGrabber in its implementation, which can cause issues if other people want to use their own IGrabNotifiers. It might be a good idea for certain items like GrabBegin to use IGrabNotifier instead, however child classes like PhysicsGrabbable call GetComponent on the hand argument passed in, so using GameObject as the argument for GrabBegin can be a good compromise.
Now one issue with trying to decouple Grabbable from BaseGrabber is that the former does call the static function BaseGrabber.ClearAllGrabs in UnsubscriberGrabber. You might not be able to cleanly replace that specific code with an interface...
Unsure how to configure hand physics properly to avoid fingers moving through objects prior to grabbing.
High quality 3d assets to support samples
This one will take a bit of work to figure out. There are two types of grabbing under consideration here:
If "snap back" is false on a snap point, then hand-to-object is easily satisfied. This is seen with the pose gallery's ring, for instance.
However, if snap back is true, then in the current implementation it appears that both the hand and the object move toward each other. This can be seen with the sword in the released sample. I think "snap back" should be an enum that allows the two behaviors above, along with what is currently implemented (which could be "both object-to-hand and hand-to-object"). But if that requires changing too many assets, maybe there should be another boolean that allows pure "object-to-hand" behavior.
I had to work on other stuff today but tomorrow I can see how this should be implemented. Please let me know if you have any helpful pointers.
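The enum proposal above could be sketched like this. A language-agnostic sketch (the shipped API is Unity/C#); the enum and function names are suggestions, not the actual HandPosing API, and positions are simplified to plain tuples.

```python
from enum import Enum

class SnapMode(Enum):
    """Hypothetical replacement for the 'snap back' boolean."""
    OBJECT_TO_HAND = 1   # the object travels to the hand
    HAND_TO_OBJECT = 2   # the hand travels to the object (snap back off)
    BOTH = 3             # current behaviour: both move toward each other

def snap_targets(mode, hand_pos, object_pos):
    """Return the (hand, object) positions after snapping."""
    if mode is SnapMode.OBJECT_TO_HAND:
        return hand_pos, hand_pos
    if mode is SnapMode.HAND_TO_OBJECT:
        return object_pos, object_pos
    # BOTH: meet halfway (the real system presumably animates this)
    midpoint = tuple((h + o) / 2 for h, o in zip(hand_pos, object_pos))
    return midpoint, midpoint
```

An enum keeps existing assets working if the serializer maps the old boolean onto OBJECT_TO_HAND / HAND_TO_OBJECT, with BOTH as the opt-in third state.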
When snapping using a Joint, the code retargets the hand too late, causing sluggish movement in the joint.
Currently we are storing the bone rotations as seen in the local puppet. But they might differ from the actual tracking data since they need to be transformed (depending on the model).
Store the raw skeleton data (or infer it from the controller if not using hand-tracking)