exokitxr / avatars
Avatar system for Exokit
Home Page: https://avatars.exokit.org
The unity loader uses synchronous zip unpacking. We can optimize it to use a web worker.
Would be cleaner to have the legs floating behind the avatar, rather than the static pose we have right now.
When scaling in 1P/3P mode, the emulated hands become scrunched or widened, when they should keep the same proportions.
Right now the update rate does not factor in frame timing, especially for things like walking input updates.
This means that slower framerates produce slower walking, among other glitches. We should account for timestamps to advance the simulation.
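One way to decouple walking speed from framerate is a fixed-timestep accumulator driven by real timestamps. A minimal sketch of that idea; all names and constants here are hypothetical, not from the codebase:

```javascript
// Advance the simulation by real elapsed time, in fixed steps, so walking
// covers the same distance at 30 fps and 90 fps. Names are hypothetical.
const STEP_MS = 1000 / 90;     // fixed simulation step
const WALK_SPEED = 1.5 / 1000; // meters per millisecond

function createWalker() {
  return {position: 0, accumulator: 0, lastTime: null};
}

function update(walker, nowMs) {
  if (walker.lastTime === null) walker.lastTime = nowMs;
  walker.accumulator += nowMs - walker.lastTime;
  walker.lastTime = nowMs;
  // Run as many fixed steps as the elapsed time allows.
  while (walker.accumulator >= STEP_MS) {
    walker.position += WALK_SPEED * STEP_MS;
    walker.accumulator -= STEP_MS;
  }
}

// Two walkers fed different frame rates cover the same distance.
const fast = createWalker();
const slow = createWalker();
for (let t = 0; t <= 1000; t += 1000 / 90) update(fast, t);
for (let t = 0; t <= 1000; t += 1000 / 30) update(slow, t);
```

The same pattern would apply to the other per-frame simulation updates, not just walking.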
Hair physics needs to be toned down: the hair should restitute more against the head pose, or be allowed to self-intersect, so that it doesn't get in the way of the player's view.
It's been a common theme that people want to use the avatar system for VTuber style rendering, with the face towards the screen in a capture.
So it might make sense to put in a facecam view on the demo page.
It would be awesome if Enter XR worked on mobile devices too! The app seems to work until Enter XR is selected. It instructs you to use cardboard but nothing appears in landscape mode.
Worth exploring alternative buttons for scaling, such as pressing X on both controllers, since the trigger and grip buttons are already used for various hand gestures.
We already have walking support but it would be good to also have teleport on the demo page.
Add an option to clear out the UI and focus on the canvas.
Unclear whether we want to use the actual fullscreen mode, though we could.
The mirror position does not match across multiplayer players.
Most likely this is due to the fact that we try to make the mirror correctly sized for any given loaded model. Maybe we shouldn't do that?
Just a heads up! Guessing the domain expired.
Models 1, 2, 12, and 9 were loaded simultaneously on Index and Quest by three people, resulting in lingering lag.
Open question: should we drop the model to the floor with gravity, when not scaling?
Sometimes WebRTC doesn't connect between two Quests. This could be due to the connection establishment dance.
It could be interesting to add an option to show/hide the legs. Since legs aren't tracked, inverse kinematics may not be good enough and could just break presence.
Add hair collision support, so that hair does not clip through the avatar face and limbs.
The controllers are currently aligned to the wrist bone of models, but this is not quite correct. We should offset ~5 cm to the center of the hand to get the expected wrist position.
When exiting XR and going back in, the peer connection will transmit audio twice.
model16 and model12 are still wrong in terms of wrist alignment. We tried to offset to account for this, but it still seems wrong.
One idea to try might be to average a midpoint between the wrist and the start of the fingers.
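Both ideas (the ~5 cm offset toward the hand center, and the wrist-to-finger midpoint) can be sketched with plain vectors standing in for THREE.Vector3; all names and the forward-direction input are assumptions for illustration:

```javascript
// Estimate a hand anchor point from skeleton joints. Plain [x, y, z]
// arrays stand in for THREE.Vector3; all names are hypothetical.
const HAND_OFFSET_M = 0.05; // ~5 cm from the wrist toward the hand center

function lerp(a, b, t) {
  return a.map((v, i) => v + (b[i] - v) * t);
}

function normalize(v) {
  const len = Math.hypot(...v);
  return v.map(x => x / len);
}

// Option 1: push the anchor 5 cm along the wrist's forward direction.
function offsetAnchor(wrist, forward) {
  const dir = normalize(forward);
  return wrist.map((v, i) => v + dir[i] * HAND_OFFSET_M);
}

// Option 2: average a midpoint between the wrist and the fingers' start.
function midpointAnchor(wrist, fingerStart) {
  return lerp(wrist, fingerStart, 0.5);
}

const wrist = [0, 1.2, 0];
const fingerStart = [0, 1.2, 0.1];
const a = offsetAnchor(wrist, [0, 0, 1]); // → [0, 1.2, 0.05]
const b = midpointAnchor(wrist, fingerStart); // → [0, 1.2, 0.05]
```

The midpoint variant has the advantage of adapting to each model's proportions instead of a fixed 5 cm.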
The camera dolly doesn't work with the non-VR render. Currently the code only adds the dolly when entering VR mode, which is weird.
When adjusting scale before XR entry, the avatar will appear "third person" and not aligned.
This is most likely due to scene scale.
I think the FBX loader does already support deformers loading, but we probably aren't parsing the visemes at the exokit avatars level:
https://github.com/mrdoob/three.js/blob/dev/examples/js/loaders/FBXLoader.js#L645
An Oculus Quest browser update broke all microphone input, so neither voice chat nor visemes will work until that's fixed.
Note that Hubs is broken too.
Exokit Avatars loads .unitypackage files, but the animations seem separate from the actual FBX models.
If we want emotes, I think we'd need to parse the .anim YAML files in the packages and inject them into the THREE.js animation system, which should be able to handle the mixing. The YAML files look like this:
%YAML 1.1
%TAG !u! tag:unity3d.com,2011:
--- !u!74 &7400000
AnimationClip:
  m_ObjectHideFlags: 0
  m_PrefabParentObject: {fileID: 0}
  m_PrefabInternal: {fileID: 0}
  m_Name: Dab
  serializedVersion: 6
  m_Legacy: 0
  m_Compressed: 0
  m_UseHighQualityCurve: 0
  m_RotationCurves: []
  m_CompressedRotationCurves: []
  m_EulerCurves: []
  m_PositionCurves: []
  m_ScaleCurves: []
  m_FloatCurves:
  - curve:
      serializedVersion: 2
      m_Curve:
      - serializedVersion: 2
        time: 0
        value: -0.012714
        inSlope: 0.17463881
        outSlope: 0.17463881
        tangentMode: 0
      - serializedVersion: 2
        time: 0.033333335
        value: -0.0068927067
        inSlope: 0.25062943
        outSlope: 0.25062943
        tangentMode: 0
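As a rough sketch of that injection path, the (time, value) pairs in such a curve could be pulled out and handed to something like `new THREE.NumberKeyframeTrack(name, times, values)`. The line-based parse below is a hypothetical illustration, not a full YAML parser:

```javascript
// Extract (time, value) keyframes from a Unity .anim float curve, in a
// shape suitable for a THREE.js keyframe track. Hypothetical sketch only.
function parseFloatCurve(animYaml) {
  const times = [];
  const values = [];
  let pendingTime = null;
  for (const line of animYaml.split('\n')) {
    let m = line.match(/^\s*time:\s*(-?[\d.eE+-]+)/);
    if (m) { pendingTime = parseFloat(m[1]); continue; }
    m = line.match(/^\s*value:\s*(-?[\d.eE+-]+)/);
    if (m && pendingTime !== null) {
      times.push(pendingTime);
      values.push(parseFloat(m[1]));
      pendingTime = null;
    }
  }
  return {times, values};
}

const sample = [
  '      - serializedVersion: 2',
  '        time: 0',
  '        value: -0.012714',
  '      - serializedVersion: 2',
  '        time: 0.033333335',
  '        value: -0.0068927067',
].join('\n');
const curve = parseFloatCurve(sample);
// curve.times → [0, 0.033333335]; curve.values → [-0.012714, -0.0068927067]
```

A real implementation would use a proper YAML parser (Unity's tagged YAML needs the `%TAG !u!` directives handled) and also map each curve's attribute path to the right bone or morph target.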
Legs IK was deemed too heavy in this screencap; it looks like she's stomping:
https://i.gyazo.com/053390d1c994d5d32a30992fb54943b8.mp4
We can reinvestigate the legs parameters for V2.
Makes it hard to move around in different scales.
The microphone was not working when transmitted from any browser on Windows 10 to an Oculus Quest.
We should also have crouch (c key) in addition to prone mode.
The hands can go to the sides, like a Naruto run.
Currently that model isn't bound properly in the IK system.
It'd be nice to have snap turn while in VR mode, and Q/E turning while in desktop mode.
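Snap turn amounts to quantizing yaw into fixed increments instead of rotating smoothly. A sketch; the 45° increment and all names are assumptions:

```javascript
// Snap turn: rotate the rig's yaw in fixed increments, which is generally
// more comfortable in VR than smooth turning. Names are hypothetical.
const SNAP_RADIANS = Math.PI / 4; // 45 degrees per snap

function snapTurn(yaw, direction) {
  // direction: +1 for a right snap (thumbstick right / the E key),
  // -1 for a left snap (thumbstick left / the Q key).
  const snapped = yaw + direction * SNAP_RADIANS;
  // Wrap into [-PI, PI) to keep the angle bounded.
  return ((snapped + Math.PI) % (2 * Math.PI) + 2 * Math.PI) % (2 * Math.PI) - Math.PI;
}

let yaw = 0;
yaw = snapTurn(yaw, +1); // 45° right
yaw = snapTurn(yaw, +1); // 90° right
yaw = snapTurn(yaw, -1); // back to 45° right
```

In desktop mode the same function could be driven by Q/E keydown events; in VR, by thumbstick edge triggers.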
During a multiplayer test, the multiplayer audio cut out on one end. Pose updates continued.
Hats off! This project is amazing!
How do we disable decapitation mode? It doesn't look right on PCs.
I was trying to run this on the Quest 2, but it doesn't work when going into VR mode: it just shows a black screen. When I looked at it through a Quest 1 headset in VR mode, it worked fine. Is anyone else having this issue?
Right now the camera doesn't align when both scale and firstperson/thirdperson views are used.
This was reported when small model2 viewed a small model1.
KB/mouse input is lagged by a frame, resulting in the head camera being one frame off from the avatar eyes.
We should copy what the multiplayer implementation does to fix this.
The teleport target is seemingly not accurate when the model is scaled.
It would be awesome if this avatar system could track hand movements without a controller on the ML1! I believe this feature is coming to Oculus in the future as well...
The default runaround posing has glitches ever since the new freeform shoulder rotation was introduced with exokitxr/exokit-browser@60f268a