A gravity and shape simulator for VR.
Teleoperation plays a vital role in industries such as medicine and manufacturing, where human operators interact with remote robotic systems. We designed and manufactured a wearable device focused on sensitivity, wearability, and synchronization, enabling users to feel the sensation of grasping virtual objects. Two prototypes were tested for their force- and shape-rendering functions.
Our wearable device comprises a 2-DoF robotic arm, haptic sensors for finger gestures, and a Unity-powered virtual scene system. To assess weight perception, participants ranked three virtual balls of different masses; in a second experiment, they identified the direction of forces randomly applied to a virtual ball. 73.3% of participants ranked the balls correctly by mass, and the overall correctness rate for force-direction identification was 87.3%. A third experiment probed shape rendering: the device rendered simple objects such as spheres accurately but struggled with more complex objects such as cups and cones. These findings indicate the device's potential for a range of haptic feedback and virtual reality applications.
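One standard way a 2-DoF arm can render the weight of a grasped virtual object is Jacobian-transpose force control: map the desired Cartesian force at the fingertip to joint torques via tau = J(q)^T f. The sketch below is illustrative only and is not taken from our firmware; the link lengths, joint convention (planar revolute-revolute), and function names are all assumptions.

```python
import numpy as np

def jacobian_2dof(q, l1, l2):
    """Planar Jacobian of a 2-link revolute arm (maps joint rates to tip velocity)."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([
        [-l1 * s1 - l2 * s12, -l2 * s12],
        [ l1 * c1 + l2 * c12,  l2 * c12],
    ])

def gravity_render_torques(q, mass, l1=0.08, l2=0.06, g=9.81):
    """Joint torques reproducing the weight of a virtual object at the fingertip.

    q: joint angles (rad); mass: virtual object mass (kg).
    l1, l2 are hypothetical link lengths, not measured from the device.
    """
    f = np.array([0.0, -mass * g])         # downward gravity force in the task frame
    return jacobian_2dof(q, l1, l2).T @ f  # tau = J^T f
```

Changing the direction of `f` renders the randomized force-direction stimuli in the same way; only the Cartesian force vector differs.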
For the low-level controller, we used the architecture developed by the ZJUI Meta team. This [link] leads to the code base we use. Building the target "haptic_dvc" compiles the firmware that controls the hardware.