godotvr / godot-xr-tools
Support scenes for AR and VR in Godot
License: MIT License
Currently pickable objects only support showing a highlighted state by setting the texture of a selected mesh to a predefined blue texture.
The highlighting should be expanded to support options such as showing or hiding nodes (such as target rings).
A solution would be for pickables to emit signals on highlight state change, and having a set of highlighter nodes that can be added as children which auto-subscribe to the event.
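A minimal sketch of that highlighter-node approach, assuming Godot 3 GDScript (the `highlight_changed` signal and node names are hypothetical, not existing XR Tools API):

```gdscript
# Hypothetical highlighter child node (e.g. TargetRing.gd).
# Assumes the parent pickable emits a "highlight_changed(enabled)" signal
# whenever its highlight state changes.
extends Spatial

func _ready() -> void:
	# Auto-subscribe to the parent pickable's highlight state changes
	get_parent().connect("highlight_changed", self, "_on_highlight_changed")

func _on_highlight_changed(enabled: bool) -> void:
	# Show or hide this node (e.g. a target ring) with the highlight state
	visible = enabled
```

Any number of such highlighter nodes could then be mixed and matched as children of a pickable without the pickable knowing about them.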
Godot version: Godot 3.0.6.stable.official.8314054 (Steam version)
OS: Win10 1803, x64
HMD: HTC Vive
When trying to teleport around, if the teleport ray collides with a trimesh collision shape (e.g. from Mesh->Create Trimesh Collision Sibling), the game closes, with no error message present in the Godot output.
The vignette shader is set for 'depth_test_disable', but its render priority is set to 0, so virtually all other "No Depth Test" materials will show through it.
This value should probably be set to some "neutral" middle value so that scene-like "no depth" materials can be occluded while HUD-like "no depth" materials still render on top.
Currently to control the PlayerBody collision layers, "Editable Children" must be enabled, and the associated KinematicBody directly edited. Instead the collision settings should be accessible in the PlayerBody node.
There is a control convention where direct or teleport movement is located on the dominant hand along with strafing (which the present XR-tools supports), while the off hand has just snap or smooth turning and can use its Y axis for non-movement in-game controls, such as quick-switching between weapons, or jumping and crouching. This is fairly trivial to hack with the current tools by copying the direct-movement scene and script and stripping out everything related to direct movement, but it would be good to have this officially as its own module so people could mix and match, similar to what has now been done in separating flight movement from direct movement.
My hacked files are here for turn only: https://drive.google.com/drive/folders/1osoVU7tWHyNJwMDtK8jpxOt9aVN4Rsgb?usp=sharing
When using the LeftHand and RightHand assets, the grip button moves the middle, ring, and pinky fingers; but the trigger does not move the index finger.
This is caused by a naming error in the filters of the blend trees in LeftHandBlendTree.tres and RightHandBlend.tres. The AnimationNodeBlend2 filters have an extra period in the index skeleton bone names, which causes the blend tree to exclude all animation from the index-finger branch in the final animation.
In both files the filters should be changed to remove the invalid period in the name as follows:
- filters = [ "Armature_Left/Skeleton:index._distal", "Armature_Left/Skeleton:index._middle", "Armature_Left/Skeleton:index._proximal" ]
+ filters = [ "Armature_Left/Skeleton:index_distal", "Armature_Left/Skeleton:index_middle", "Armature_Left/Skeleton:index_proximal" ]
The Function_Pickup script maintains an average velocity (linear and rotational) for itself, which is essentially the controller.
When an object is dropped, the controller's average linear and rotational velocity is applied to the object. This works only if the picked-up object has the "Reset Transform On Pickup" property enabled, which snaps the origin (center of mass) of the object to the Function_Pickup location.
If the "Reset Transform On Pickup" property is disabled to emulate precise-location grabbing, the origin (center of mass) of the object may be some distance from the controller, and the controller's linear and rotational velocities are not applicable.
Consider the scenario of throwing a long rod by holding one end, then flicking the wrist and letting go. As the hand was not necessarily moving, the rod will not move as expected.
The correct solution is to track the average velocity (linear and rotational) of the picked up object rather than the controller. In the example above of throwing a rod, the origin (center of mass) of the rod will be moving, and so when the rod is let go it will move with the expected velocity.
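A rough sketch of averaging the held object's velocity over the last few physics frames instead of the controller's (field names like `picked_up_object` are illustrative, not the actual Function_Pickup API):

```gdscript
# Illustrative velocity-averaging sketch for a held object (Godot 3).
var _last_origin := Vector3.ZERO
var _velocity_samples := []          # ring buffer of recent linear velocities
const MAX_SAMPLES := 5

func _physics_process(delta: float) -> void:
	if picked_up_object:
		var origin = picked_up_object.global_transform.origin
		# Sample the object's own linear velocity, not the controller's
		_velocity_samples.append((origin - _last_origin) / delta)
		if _velocity_samples.size() > MAX_SAMPLES:
			_velocity_samples.pop_front()
		_last_origin = origin

func _average_velocity() -> Vector3:
	var sum := Vector3.ZERO
	for v in _velocity_samples:
		sum += v
	return sum / max(_velocity_samples.size(), 1)
```

On release, `_average_velocity()` would be applied to the rigid body, so a flicked rod tip inherits the motion of its own center of mass. The same scheme could track angular velocity from the object's basis.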
Hi, when setting the Function_Pointer's Active Button to Vr Action and entering the name of one of my actions (100% defined in the project's Input Map), the editor is filled with the error message:
The InputMap action "vr_right_trigger" doesn't exist. Did you mean "ui_right"?
The message disappears when I remove the tool declaration from the script (obviously), but I am not sure whether it needs to be a tool script.
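One way to avoid flooding the editor with these errors is to guard the action lookup while the script runs as a tool. A minimal sketch, assuming Godot 3 (the helper name is hypothetical, not XR Tools API):

```gdscript
# Sketch: avoid querying a possibly-missing InputMap action in the editor.
# _is_action_pressed_safe is a hypothetical helper, not part of XR Tools.
func _is_action_pressed_safe(action: String) -> bool:
	if Engine.editor_hint:
		return false  # tool scripts also run in the editor; skip input there
	if not InputMap.has_action(action):
		return false  # guard against typos or unconfigured actions
	return Input.is_action_pressed(action)
```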
When using with the Direct Movement function, the player motion freezes when encountering steps, or slopes greater than around 35 degrees. The player then slides back down until control returns.
This appears to be caused by a RayCast "Tail" used for detecting whether the player is in contact with the ground:
Unfortunately this technique doesn't work with slopes or steps:
I've tried setting up a simple scene where I simply want to trigger an event when clicking on an object.
I've followed the instructions from the README, where I should just implement the _on_pointer_pressed and _on_pointer_released functions on the StaticBody node.
That doesn't seem to do anything though.
I've tried the teleport function and it works fine.
Am I missing something or is it a bug?
I'm using the godot-openxr plugin 1.1.1 and godot-xr-tools 2.2.0.
Thank you!
Hi everyone,
I would like some input from others about something I've come to regret since starting this project and that is its naming conventions.
Most scripts/scenes start with a word identifying their group, followed by a name identifying their purpose. So we have Function_Direct_Movement and Object_pickable. Not really the best picks.
I want to do a pass and rename all these to more suitable names, but obviously this would break existing uses of the plugin. It would thus warrant a 2.0 release.
How do people feel about this as a change?
The Godot OpenXR library has an issue where the controller.get_joystick_axis(JOY_VR_ANALOG_GRIP) can report values outside of the 0.0 - 1.0 range (observed on Oculus Quest controllers).
This causes a secondary failure in the Godot XR Tools LeftHand.gd and RightHand.gd scripts which aren't robust to out-of-range values. These scripts just multiply the joystick position by 2.5 to produce animation tree positions. The result is that the animation tree positions can be outside of the legal range and the animation doesn't update. As such quickly releasing the grip button can result in the hand remaining partially gripped.
The solution appears to be adding the following line to LeftHand.gd and RightHand.gd after reading the controllers:
grip = clamp(grip, 0.0, 2.5)
It may also be advisable to apply the same protection to the trigger axis, just to be robust against out-of-range values in the future.
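A sketch of what that protection could look like when reading the controller axes (assuming Godot 3 global constants; the exact variable names in LeftHand.gd may differ):

```gdscript
# Sketch (Godot 3): clamp both analog axes into their legal 0.0 - 1.0
# range before scaling them into animation-tree blend positions, so an
# out-of-range report cannot leave the hand stuck partially gripped.
var grip = controller.get_joystick_axis(JOY_VR_ANALOG_GRIP)
var trigger = controller.get_joystick_axis(JOY_VR_ANALOG_TRIGGER)
grip = clamp(grip, 0.0, 1.0)
trigger = clamp(trigger, 0.0, 1.0)
```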
The 4.0-dev branch contains the work-in-progress conversion of Godot-XR-Tools to Godot 4.
Changes made to master should be ported to the 4.0-dev branch as well, as the branches will otherwise diverge.
If possible I would like to suggest adding an XR Button tool.
I am still learning to code, but I have made simple button code as an example. It works by detecting the player's Function_pickup node. This concept could be extended to other types of buttons (levers, physics buttons, etc.).
```gdscript
extends Area
class_name XRButton

# Maximum distance at which the button reacts to the hand
export var interact_range := 0.1

signal pressed(hand)
signal depressed(hand)

var last_hand: ARVRController = null
var menu_btn_state := 0

func _ready():
	connect("area_entered", self, "_on_area_entered")
	connect("area_exited", self, "_on_area_exited")

func _process(_delta):
	if is_instance_valid(last_hand):
		if global_transform.origin.distance_to(last_hand.global_transform.origin) <= interact_range:
			if menu_btn_state != last_hand.is_button_pressed(15):
				if menu_btn_state == 0:
					emit_signal("pressed", last_hand)
				elif menu_btn_state == 1:
					emit_signal("depressed", last_hand)
				menu_btn_state = last_hand.is_button_pressed(15)

func _on_area_entered(area):
	if area is Function_Pickup and area.get_parent() is ARVRController:
		last_hand = area.get_parent()

func _on_area_exited(area):
	# Compare against the area's parent controller; the area itself is
	# the Function_Pickup node, not the hand.
	if is_instance_valid(last_hand) and area.get_parent() == last_hand:
		last_hand = null
```
There are two lines in the code, https://github.com/GodotVR/godot-xr-tools/blob/master/addons/godot-xr-tools/functions/Function_Direct_movement.gd#L216 and https://github.com/GodotVR/godot-xr-tools/blob/master/addons/godot-xr-tools/functions/Function_Direct_movement.gd#L180, which say:
origin_node.transform *= t2 * rot * t1
To avoid the transform accumulating error and drifting out of orthonormal form, they should probably read:
origin_node.transform = (origin_node.transform * t2 * rot * t1).orthonormalized()
I had a nasty run-in with this issue at godotengine/godot#43320
Created a simple app using Godot v3.2.beta1, godot-xr-tools 2.0 and the OVRMobile ARVRServer. The app loads fine on the Quest (v17.0.0.244) and the headset tracks, but the controllers are frozen; they will rotate when you press the trigger but won't track hand movements.
Regards
The current flight control is implemented as a secondary feature of Function_Direct_movement which prevents adjusting/separating the order of walking and flying controls.
Additionally flight deserves some advanced features, such as:
I found the reference to these scripts in the intro video, but I'm having trouble getting things connected correctly (possibly related to recent updates to 3.0.6 or the noted pending change from a kinematic to a spatial body).
Steps taken:
1. Add an OVRFirstPerson node.
2. Right-click the OVRFirstPerson node in the Scene tab and select Editable Children.
3. Right-click Left_Hand in the Scene tab and instance the child scene Function_Teleport. Note: this produces a warning that it has no collision shape, but that does not prevent running, and adding one seems to have no effect.
4. Right-click Right_Hand in the Scene tab and instance the child scene Function_Direct_movement. Note: it's not clear whether this is a dependency for teleportation or just an example of assigning different functions to each hand.
5. Select Function_Teleport in the Scene tab and find its Script Variables in the Inspector; assign Origin to OVRFirstPerson, which is an instance of ARVROrigin.
6. Select Function_Direct_movement in the Scene tab and find its Script Variables in the Inspector; assign Origin to OVRFirstPerson (an instance of ARVROrigin) and Camera to ARVRCamera.
Results:
The wind-sensor collision shape in the Function_Wind_movement Movement Provider is located at the player's feet, and can get frozen in position if the player performs an exclusive movement operation such as climbing.
The following images show the wind-sensor when walking, then the wind-sensor when climbing:
The wind-sensor sits at the player's feet because the Function_Wind_movement script places it at the PlayerBody/KinematicBody origin, which is not in the middle of the player's body. The wind-sensor fails to follow the player during an exclusive movement because the exclusive movement skips lower-priority movement providers, so the code that moves the wind-sensor never runs.
The simplest solution may just be to reparent the wind-sensor to the ARVRCamera.
Current implementation seems to only allow teleportation onto flat plane. Would be nice to allow for slopes so that ramps can be used.
We added hand models that react to inputs a little while ago; we should add a help page on this.
Pardon what will probably be a rookie mistake! I'm using one Viewport_2D_in_3D. The menu scene is correctly displayed in the Godot editor.
However, when the scene launches, the following errors show up, the Viewport node is white, and I can't seem to click on anything.
E 0:00:18.356 get_path: Cannot get path of node as it is not in a scene tree.
<C++ Error> Condition "!is_inside_tree()" is true. Returned: NodePath()
<C++ Source> scene/main/node.cpp:1642 @ get_path()
<Stack Trace> entry.gd:17 @ _ready()
E 0:00:18.364 get_node: (Node not found: "" (relative to "").)
<C++ Error> Condition "!node" is true. Returned: __null
<C++ Source> scene/main/node.cpp:1371 @ get_node()
<Stack Trace> entry.gd:17 @ _ready()
E 0:00:18.371 setup_local_to_scene: ViewportTexture: Path to node is invalid.
<C++ Error> Condition "!vpn" is true.
<C++ Source> scene/main/viewport.cpp:69 @ setup_local_to_scene()
<Stack Trace> entry.gd:17 @ _ready()
Here is the code throwing the error, although I suspect the actual problem is deeper in the nest
extends Node

func _ready() -> void:
	name = 'entry'
	# By default open the client main menu
	var spath: String = "res://client/main_client.tscn"
	if ("--server" in OS.get_cmdline_args() || OS.has_feature("dedicated_server")):
		# Requesting to open in server mode, so change the string to it
		spath = "res://server/main_server.tscn"
		# Cache that we are indeed in server mode
		loader.is_dedicated_server = true
		print("Set to server mode")
	# And transition into the appropriate scene
	# warning-ignore:return_value_discarded
	get_tree().change_scene(spath) # Here is line 17
tl;dr: After center_on_hmd() has been called, it is very difficult to recover the absolute position in the physical space. Avoid calling this in the Teleport function so we don't lose the position in the user's physical space; move/rotate the ARVROrigin instead.
Long description
When doing room-scale VR games, e.g. with the HTC Vive, it is important to maneuver the player back into the center of their physical space, to maximize free walking in every direction. For example, the "Superhot VR" game does this by requiring the player to touch an object at the beginning of every level.
Also, for the purpose of showing the user the room boundaries (->Chaperone system), knowledge of the physical position is necessary.
For OpenVR (and probably all other such systems), the ARVRCamera's position relative to the ARVROrigin equals the physical room position, unless center_on_hmd() is called. After that, the relative position no longer has a physical equivalent.
I'll do that once I've time
It's listed as something to do in the instructions, but here is a reminder.
https://github.com/GodotVR/godot-xr-tools/wiki/Teleport
It seems that you need the KinematicBody so you can access get_world().get_direct_space_state(), which you need to execute intersect_ray() or collide_shape().
The KinematicBody is not happy unless it contains a CollisionShape (which is why there is a warning). Currently you create a CapsuleShape collision shape in the _ready() of Function_Teleport.gd, so why not simply put a CapsuleShape into the KinematicBody and use that in your collide_shape() call?
I've tried this (in a function which is supposed to create and place objects where you are pointing, instead of teleporting) and there seems to be a problem: when you set the global_transform on the CollisionShape, it seems to set the global_transform on the KinematicBody whether you like it or not! Is this something you have encountered, and would it explain why you needed to make a local collision shape to use in the collide_shape() call?
Snap turning currently takes some time for the rotation to accumulate before performing the first snap. This is confusing to the user as there is a measurable duration where turning feels broken. Instead the first snap of the turning should be almost instantaneous, and only repeat snaps should be delayed by the snap-turn rate.
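A sketch of the suggested timing change, where the delay applies only between repeat snaps (names like `snap_turn_angle` and `_snap_turn` are illustrative, not the actual XR Tools code):

```gdscript
# Sketch (Godot 3): snap immediately on the first deflection, then apply
# the snap-turn delay only to subsequent repeat snaps.
var _turn_timer := 0.0

func _physics_process(delta: float) -> void:
	var input = controller.get_joystick_axis(JOY_ANALOG_LX)
	if abs(input) < 0.5:
		_turn_timer = 0.0                          # reset so next snap is instant
		return
	if _turn_timer <= 0.0:
		_snap_turn(sign(input) * snap_turn_angle)  # first snap happens right away
		_turn_timer = snap_turn_delay              # delay only gates repeats
	else:
		_turn_timer -= delta
```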
The following movement providers have "active" states:
If any of these providers are disabled when "active" then they don't get a chance to update their "active" state or report the end events.
I started the code in this repository years ago without really thinking about a name. It was a common repository of VR related code that applies to any platform supported by Godot.
The name doesn't make a lot of sense to people so I am thinking of renaming this repository to godot-vr-toolkit
. It would be good to get some feedback on this from everyone using this code.
What do you guys think?
The current debug screen is nice and shows player position and such. However for developing on the quest 2 I needed to see what certain variables were at. So I tweaked a text label in a similar setup, and got help from people to use a singleton logic to pass variables to it. Then in game I can see outputs of stuff I am working on.
Overall it would be nice to have some kind of easy to pass variable screen for like a dump of values, or tracking specific values, etc. My current version just scrolls to the bottom when new stuff is added, but it would be interesting to have that when aiming/pointing the joystick on that controller could scroll the window. Or allow for tabs or buttons to select or change values on the fly in game. To prototype and debug issues faster without having to rebuild the game to the quest. If that were to be done it might be having to implement some input prompt or keyboard, but that would get complicated too.
Having some way to pass variables though is mainly what I have been using a tweaked version of this to do, but would be neat to see some other features that people might need for debugging.
This feature should be common, but it doesn't seem to be possible to render directly to the viewport via a ColorRect (for example, as in the official demo).
Devices: Oculus Quest 2 (No errors are printed)
The current flight system only supports inputs from a single controller. Instead each controller should be able to contribute thrust and/or strafing.
For me the workflow has often been to start in Godot, build up an environment, and then bring in assets from outside. With this approach you tend to start with the player, so the player setup ends up in layer 1.
The problem this introduces for others is that imported assets containing physics objects also default to layer 1, resulting in the plugin seemingly being broken (e.g. the player falling through the floor, or not being able to move around the scene).
I suggest we move the player to layer 20 and update the defaults of all the objects in the plugin accordingly, ending up with physics layers such as:
(not final).
This will be a breaking change but I think it will make the plugin easier to use from the start.
Something that came up on the Godot XR discord channel: having a virtual keyboard implementation people can just use would be really handy.
The Direct Movement functions should support specifying what the "forward" direction is referenced from.
The Object_pickable script makes use of an optional "PickupCenter" child node, but only when the "reset_transform_on_pickup" is enabled. The _get_configuration_warning only reports a warning for a missing "PickupCenter" if the "reset_transform_on_pickup" flag is enabled.
The _ready() function is written with:
center_pickup_on_node = get_node("PickupCenter")
Unfortunately the get_node reports an error if the node cannot be found. To prevent the error from cluttering the debugger, this could be changed to either:
center_pickup_on_node = get_node_or_null("PickupCenter")
or possibly:
if reset_transform_on_pickup:
	center_pickup_on_node = get_node("PickupCenter")
It's more common to allow the joystick to strafe left and right as well as move forwards and backwards, and to put snap turn on the left/right axis of the opposite controller. How is this system going to be able to capture commands from the other controller?
If the player stands on a slope they will slide downhill. This used to be fixable by setting stop_on_slope=true in the KinematicBody.move_and_slide call; however, that has been broken since Godot 3.2, so an alternative implementation will be needed.
A common approach is to detect when the player is on the ground and "skew" the gravity in the direction of the ground surface normal.
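A sketch of that approach, assuming Godot 3 (`ground_normal` and `gravity_strength` are illustrative names; the normal would come from a downward raycast or the last slide collision):

```gdscript
# Sketch: when grounded, apply gravity along the ground surface normal
# instead of straight down, so the player does not creep down walkable
# slopes while standing still.
func _apply_gravity(velocity: Vector3, delta: float) -> Vector3:
	var gravity_dir := Vector3.DOWN
	if kinematic_body.is_on_floor():
		# ground_normal: surface normal from a raycast or slide collision
		gravity_dir = -ground_normal
	return velocity + gravity_dir * gravity_strength * delta
```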
This is essential for efficient debugging so you can perform quick iterative tests against the parts of the game under development without needing to boot up the headset.
The OQ_Toolkit had this feature on by default when no VR interface was present, and overlaid the control instructions onto the screen.
https://github.com/NeoSpark314/godot_oculus_quest_toolkit/blob/master/OQ_Toolkit/OQ_ARVROrigin/scripts/Feature_VRSimulator.gd
It would be great if the keyboard controls were to map onto the VR controls (eg WASD and Shift-WASD for left and right controller joysticks) so we didn't need to implement motions in two different ways.
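A sketch of mapping keys onto a simulated joystick vector so desktop input feeds the same movement code as VR input (the `sim_*` action names are hypothetical and would need to be defined in the Input Map):

```gdscript
# Sketch (Godot 3): translate WASD-style actions into a simulated
# left-controller joystick vector for a desktop VR simulator.
func _get_simulated_joystick() -> Vector2:
	var axis := Vector2.ZERO
	axis.y = Input.get_action_strength("sim_forward") - Input.get_action_strength("sim_back")
	axis.x = Input.get_action_strength("sim_right") - Input.get_action_strength("sim_left")
	return axis
```

The movement providers would then consume this vector exactly as they consume `controller.get_joystick_axis(...)`, so motion only needs to be implemented once.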
Right now, jumping can only be activated with a button press. I think an option should be added so that it can be activated by the player swinging their arms up.
The Function_Direct_movement should emit signals for player_fly_start and player_fly_end. This would allow for enabling flying sound-effects.
Additionally to handle complex control rules (if flying then disable climbing, if climbing then disable flying, etc.) a player interaction manager would need to listen for all movement provider signals, of which flying appears to be the only one missing.
I merged in my vignette solution but right now it needs to be manually hooked up.
This is mostly a reminder for myself to spend some time and add some automatic logic: by checking how much the camera has rotated and moved in global space since the last frame, we can adjust the vignette.
The idea here is to calculate a comfort level between 0 (no movement) and 1.0 (fast movement), where we have settings of how strong the angular and linear movement counts that calculates this level.
If this value is higher than the previous frame's value we increase our value, else we decay it. This means we don't get rapid in/out movement of the vignette; movement of the head will bring it in, and then slowly (say in 0.2 or 0.3 seconds) it moves outwards again.
So in pseudo code:
var auto_vignette = 0.0
export var auto_vignette_decay = 0.3 / 1.0
export var auto_vignette_strength = 0.3

func _process(delta):
	auto_vignette = clamp(auto_vignette - (delta * auto_vignette_decay), 0.0, 1.0)
	var new_vignette = calculate_vignette_on_movement()
	if new_vignette > auto_vignette:
		auto_vignette = new_vignette
	set_radius(1.0 - (auto_vignette * auto_vignette_strength))
There are numerous almost-identical routines to find player resource nodes:
MovementProvider.gd has some of these functions, but they only work if the caller is a MovementProvider. Instead it would be better if these find routines were available to any node under the player - for example by making them static functions of a helper script.
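A sketch of such a helper script with static find routines (the `XRHelpers` name is hypothetical here, assuming Godot 3):

```gdscript
# Hypothetical XRHelpers.gd: static find routines usable from any node
# under the player, not just MovementProvider subclasses.
class_name XRHelpers

static func get_arvr_origin(node: Node) -> ARVROrigin:
	# Walk up the tree until an ARVROrigin ancestor is found
	var current = node
	while current:
		if current is ARVROrigin:
			return current
		current = current.get_parent()
	return null

static func get_arvr_camera(node: Node) -> ARVRCamera:
	# Find the camera as a child of the origin
	var origin = get_arvr_origin(node)
	if origin:
		for child in origin.get_children():
			if child is ARVRCamera:
				return child
	return null
```

Any script could then call `XRHelpers.get_arvr_origin(self)` instead of duplicating the search logic.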
Documentation on the wiki page about this is still blank, time to write it...
For throwing, having the object rotate based on the controller's rotation would make it more immersive.
I tried to work out the math for it myself but only found a non-optimal way to do it.
A PlayerBody has ground physics settings consisting of both 'drag' and 'traction' properties:
Currently the rules for ground motion are:
If we try to ascribe physical meaning to 'drag' and 'traction' we get:
The descriptions above are not perfectly correct, however, because we apply 'drag' even when the player is attempting to move. As a result the player can't actually move at the 'Max Speed', because the 'drag' always pulls them slightly slower. For example, the following screenshot shows a player set up with a 'Max Speed' of 4.0 running as fast as they can:
One fix for this is to change the rules for ground motion as follows:
Doing this simple change results in the player being able to run at the 'Max Speed' and still slowing if the joysticks are neutral:
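The changed rule could look something like this (a sketch; the function and property names are illustrative, not the actual PlayerBody code):

```gdscript
# Sketch: apply traction toward the requested velocity while the player
# is giving input, and apply drag only when there is no input.
func _apply_ground_motion(velocity: Vector3, input_velocity: Vector3, delta: float) -> Vector3:
	if input_velocity.length() > 0.01:
		# Player is trying to move: accelerate toward the target using
		# 'traction', allowing the full 'Max Speed' to be reached.
		return velocity.move_toward(input_velocity, traction * delta)
	# No input: slow to a stop using 'drag'.
	return velocity.move_toward(Vector3.ZERO, drag * delta)
```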
The Object_pickable script fires a 'picked_up' signal when the object is picked up, but has no corresponding signal for when the object is dropped. The let_go function could be modified to fire a new 'dropped' signal.
It would also be beneficial to add an "action" signal for when the picked object has its action invoked by the associated controller action button.
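A sketch of the additional signals in Object_pickable (signal and function names beyond the existing `picked_up` are suggestions, not current API):

```gdscript
# Sketch: extend Object_pickable with 'dropped' and 'action_pressed'
# signals alongside the existing 'picked_up' signal.
signal picked_up(pickable)
signal dropped(pickable)
signal action_pressed(pickable)

func let_go():
	# ... existing release logic (reparenting, applying velocity) ...
	emit_signal("dropped", self)

func action():
	# Invoked when the controller's action button fires while held
	emit_signal("action_pressed", self)
```

Game code could then connect to `dropped` for release sound effects and to `action_pressed` for use-while-held behaviour.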
The movement providers attempt to create a PlayerBody if one does not exist; however the PlayerBody does not appear in the Scene tree-view even though it exists. The result is that:
This appears to be an issue in MovementProvider.gd in _create_player_body_node() and how it sets the owner. Replacing
player_body.set_owner(arvr_origin.owner)
with:
player_body.set_owner(get_tree().get_edited_scene_root())
seems to fix the issue.
Right now most of the features in XR Tools need to be added through subscenes. That makes a lot of sense as for many we need a handful of objects properly configured to make the function usable.
However, we also define all classes with class_name, which makes it possible to see them in the node tree and just add the node. If you do this, you get a node of the correct type with the script attached, but it doesn't use the related scene.
Most of the scenes are really simple, with just a handful of objects. It is entirely possible to add the needed nodes in the _ready function of a script and remove the need for the scenes.
Obviously this is something worth discussing: whether we want to do this, and what types of classes to restrict it to. For instance, the scenes meant to be inherited should remain scenes, but we could fully remove the function scenes.
Also this would be a serious breaking change and maybe something we should consider only for Godot 4.
The docs for move_and_slide() say:
This method should be used in Node._physics_process (or in a method called by Node._physics_process), as it uses the physics step's delta value automatically in calculations. Otherwise, the simulation will run at an incorrect speed.
linear_velocity is the velocity vector (typically meters per second). Unlike in move_and_collide, you should not multiply it by delta -- the physics engine handles applying the velocity.
Unfortunately we have on line 161 of Function_Direct_movement.gd
velocity = controller.global_transform.basis.z.normalized() * -delta * max_speed * ARVRServer.world_scale
velocity = $KinematicBody.move_and_slide(velocity)
Same thing around line 249
velocity = dir.normalized() * -forwards_backwards * delta * max_speed * ARVRServer.world_scale
...
# apply move and slide to our kinematic body
velocity = $KinematicBody.move_and_slide(velocity, Vector3(0.0, 1.0, 0.0))
I did wonder why I had to set the value to 250m/s to get anywhere!
I really don't like this design for the Godot function: delta should be passed in as a parameter to move_and_slide() instead of being ingested in some other magical way.
I found this while trying to hack in a climbing/crawling feature where you hook a spike or a hook from your hand into the environment collision object to pull yourself along without using a trigger button (which is the convention for grab). The challenge is working out how to unhook, which I am hoping to do by pulling while your KinematicBody pushes you away.