
ORCL_Sim - A System Architecture for Studying Bicyclist and Pedestrian Physiological Behavior Through Immersive Virtual Environments

Introduction

This repository contains the code for Tobii eye tracking integrated with the HTC VIVE Pro Eye in Unity. It is part of the projects of the Omni-Reality and Cognition Lab at the University of Virginia (https://engineering.virginia.edu/omni-reality-and-cognition-lab). More details and visualizations of our projects can be found at http://uvabrainlab.com/portfolio/mobility-and-infrastructure-design/

[ORCL logo]

IMPORTANT UPDATE

In Tobii Pro SDK version 1.9, VR support was deprecated: https://developer.tobiipro.com/unity/unity-getting-started.html

To make everything work, you may download our test scene with a compatible Tobii Pro SDK version already integrated here. If you choose to use our test scene, please skip the Tobii Pro SDK setup steps (the other steps are still necessary).

Prerequisite

  1. HTC VIVE Pro Eye with the Tobii eye tracking system
  2. Unity version 2018.4.16 or 2018.3.14
  3. Python 3.6.3 (Anaconda distribution recommended)
  4. SteamVR
  5. Complete the setup for the HTC VIVE Pro Eye
  6. Tobii Pro SDK for your platform
  7. Set up the eye tracking software (SR runtime) if needed

The HTC VIVE Pro Eye hardware (headset, controllers) is made by HTC, and the integrated eye tracker is made by Tobii. They provide multiple ways to access the eye tracking data:

  • Tobii Pro SDK: A general SDK for getting eye tracking data. This repository uses its Python and Unity APIs only.
  • Tobii XR SDK: An SDK for Unity, also developed by Tobii; to get started, follow the steps in this link. Tobii XR requires an analytical license to access the raw data; otherwise, eye tracking can only be used for interaction.
  • Vive Eye Tracking SDK: An SDK for eye tracking from HTC. The forum for it can be found here.

This repository includes sample code and tutorials for the Python and Unity APIs of the Tobii Pro SDK only.

Tobii Pro SDK data collection

Website of Tobii Pro SDK: http://developer.tobiipro.com/index.html

You can use either the Python API or the Unity API to get the eye tracking data.

Python API

Set up the Python API as described at http://developer.tobiipro.com/python/python-getting-started.html.

Then run TobiiEyeTracking.py from this repository to collect the data externally (not within Unity).

If an eye tracker is successfully found, data collection runs until the 'q' key is pressed (you can change this to another key in the code). An output .csv data file (named with the start and end time, like the sample output) will be exported to the output_dir defined in the code:

output_dir = 'C:/github/ORCL_VR_EyeTracking/Data/EyeTrakcing/TobiiProPython'
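For orientation, below is a minimal sketch of what such an external collection script can look like with the tobii_research Python bindings (set up in the getting-started step above). The output file name, the stop condition (Enter instead of 'q'), and the use of the HMD gaze stream are assumptions for illustration; the repository's TobiiEyeTracking.py is the authoritative version, and the HMD stream requires a VR-capable SDK version as noted in the update above.

import csv
import tobii_research as tr

output_path = 'gaze_sample.csv'  # hypothetical name; the repo script builds it from output_dir

samples = []

def gaze_callback(gaze_data):
    # With as_dictionary=True, each sample arrives as a plain dict.
    samples.append(gaze_data)

eyetrackers = tr.find_all_eyetrackers()
if not eyetrackers:
    raise RuntimeError('No eye tracker found.')
eyetracker = eyetrackers[0]

# For the VIVE Pro Eye, subscribe to the headset-mounted (HMD) gaze stream.
eyetracker.subscribe_to(tr.EYETRACKER_HMD_GAZE_DATA, gaze_callback, as_dictionary=True)
input('Recording... press Enter to stop.')  # the repo script listens for the q key instead
eyetracker.unsubscribe_from(tr.EYETRACKER_HMD_GAZE_DATA, gaze_callback)

if samples:
    with open(output_path, 'w', newline='') as f:
        writer = csv.DictWriter(f, fieldnames=samples[0].keys())
        writer.writeheader()
        writer.writerows(samples)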

Unity SDK (PREFERRED METHOD)

To start, read the documentation for the Tobii Pro SDK (http://developer.tobiipro.com/unity.html) and download the Tobii Pro SDK for Unity.

  1. Create a new project, or open an existing project, in Unity.

  2. Select Assets > Import Package > Custom Package... from the main menu, or right-click in the Project window.

  3. Browse to the downloaded Tobii Pro SDK package, named TobiiPro.SDK.Unity.Windows.

  4. In the next dialog, choose to import all files.

  5. In the Project window, drag and drop the "TobiiPro\VR\Prefabs\[VREyeTracker]" prefab into the scene and, in the Inspector, select 'Subscribe To Gaze'.

  6. (Optional) Drag and drop the "TobiiPro\VR\Prefabs\[VRCalibration]" prefab into the scene. Select the [VRCalibration] prefab and, in the Inspector, select a key to be used to start a calibration.

  7. Drag and drop the "TobiiPro\VR\Prefabs\[VRSaveData]" prefab into the scene. Select the [VRSaveData] prefab and, in the Inspector, select a key to be used to start and stop saving data, then select 'Save Data', 'Save Unity Data', and 'Save Raw Data'.

  8. Save the current project.

  9. Play the scene and press the save-data key selected earlier to start and stop saving. The saved XML data can be found in the "Data" folder in the project root.

    More details can be found in TobiiProVR_readme.txt in this repository.

If an XML file was created without any recorded data, open the Windows Task Manager, go to 'Services', and check whether 'Tobii Service' is running; try restarting it and collecting data again.
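If you prefer to check this from a script or terminal, a quick status query can look like the sketch below. The service name 'Tobii Service' is assumed from the Task Manager entry above and may differ on your machine; restarting a service requires an administrator shell.

import subprocess

# Query the Tobii eye tracking service status on Windows (service name assumed).
result = subprocess.run(['sc', 'query', 'Tobii Service'],
                        stdout=subprocess.PIPE, universal_newlines=True)
print(result.stdout)

# To restart it from an administrator shell:
#   sc stop "Tobii Service" && sc start "Tobii Service"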

2021.05.24 update: The correct experimental order is:

  1. Connect the VIVE Pro Eye to your computer (in our case via a wireless connection), and open VIVE Wireless and SteamVR;
  2. Run room setup to make sure the controllers and headset are in the right place;
  3. Run eye calibration on the HTC VIVE Pro Eye;
  4. Open the Unity scenarios;
  5. Start or restart the SR runtime software right before playing the scene (wait until the small robot icon turns orange, as shown in the figure below); this helps ensure that data collection is working and avoids empty XML files;

[Figure: SR runtime status check]

  6. Play the scene in Unity;
  7. Stop the scene and check whether all the data were collected.

Video Recording

There are two ways to do the video recording: external screen recording or the internal Unity Recorder.

Internal Unity Recorder

Different versions of Unity require different steps.

For Unity 2018, search for "Unity Recorder" in the Unity Asset Store, then download and import it. This is a free package for recording gameplay.

[Figure: Unity Recorder]

In Unity 2019 or newer, Unity Recorder can be found in the Package Manager. Go to 'Window' - 'Package Manager', click 'Advanced' - 'Show preview packages', find 'Unity Recorder', and install it.


After importing or installing Unity Recorder, select Window > General > Recorder > Recorder Window from the main menu.

After configuring the Recorder, press 'START RECORDING', or press 'F10' on the keyboard for a quick start. A frame rate of 24 is suggested.

Since we have already set up VR eye tracking data saving, the data collection process will start at the same time. The saved MP4 files can be found in the "Recordings" folder in the project root.

The advantage of this method is that the Unity Recorder captures exactly every frame rendered in the scene. However, the frame rate of Unity during gameplay is not fixed, while the output video has a pre-defined fixed frame rate, so it is difficult to recover the experiment timestamps from the video. For example, if the Unity Recorder frame rate is set to 30 Hz but the actual game frame rate is ~15 Hz, the output video will be half the actual length. As of 2021.1, there has been no valid solution for this issue.
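To make the timing problem concrete, here is a tiny illustration. The per-frame timestamp log is a hypothetical construct; in practice the timestamps saved with the eye tracking data can serve this role.

RECORDER_FPS = 30  # fixed frame rate configured in the Recorder window

def video_time(frame_idx):
    # Time implied by the video file itself; misleading whenever the
    # actual game frame rate deviates from RECORDER_FPS.
    return frame_idx / RECORDER_FPS

def experiment_time(frame_idx, frame_timestamps):
    # frame_timestamps: one timestamp logged per rendered frame.
    return frame_timestamps[frame_idx]

# If the game actually ran at ~15 Hz, frame 300 occurred ~20 s into the
# experiment, but the 30 FPS video places it at 10 s.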

External screen recording

Many programs can be used for screen recording; we use OBS Studio in our study. In SteamVR, select 'Display VR View', drag the VR view window to an idle display and maximize it, then open OBS Studio and add this display as a new source, as indicated in the figure below. More settings (canvas size, frequency, file names) can be found in the 'Settings' option.

[Figure: OBS setup demo]

The advantage of this method is that it can integrate different video collection systems (e.g., room cameras) with the same timestamp and frequency, as shown with our lab setup below.

[Figure: video collection setup]

A sample video on YouTube about our experiment: Pedestrian crossing using smartphone app

Sample scene

So far, we have set up everything needed for data collection. For your convenience, we have also uploaded a sample Unity scene for the whole process; the Google Drive link to it is here.

Process Eye Tracking Data

Suppose the XML data are collected in the "Data" folder in the project root (as in the '\Data\EyeTrakcing\TobiiProUnity' folder of this repository), and the videos are collected in the "Recordings" folder in the project root (as in the '\Data\Video\1.Raw Videos' folder of this repository). The goal of this part is to map the gaze data onto the videos.

The four Python scripts under the 'EyeTrackingProcess' folder provide a workflow for processing the eye tracking data:

  • 0.video2pic.py extracts frame images from the videos.
  • 1.ReadingTxtFile.py reads the XML file and reshapes it into a more readable .csv file.
  • 2.PlotEyeTrackingOnImgs.py reads the .csv file from the previous step, maps the gaze points onto the corresponding video frames, and writes out the images (a rough sketch of this mapping is shown after this list). For more information about the coordinate systems, please refer to 'Useful tips and hints' at the bottom of this page, and see this page for more details. Note: the frame rate of the Unity Recorder should be set to 24, otherwise this script may have problems (e.g., at 30 FPS; other frame rates have not been tested). If you are using an external video recording method such as OBS Studio, you need to cut the recording into separate videos first.
  • 3.Img2video.py collects all the images and converts them back into a video with gaze overlaid, like the sample output video in '\Data\Video\4.Gazevideos_out\movie2020-08-2818h08m.mp4'.
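As a rough sketch of steps 0 and 2 above, the loop below extracts frames with OpenCV and draws a gaze circle on each one. The file names and the CSV column names (frame, gaze_x, gaze_y as normalized [0, 1] coordinates) are assumptions for illustration only; the repository scripts define the actual formats and the exact coordinate handling.

import os
import cv2
import pandas as pd

os.makedirs('frames', exist_ok=True)
os.makedirs('frames_gaze', exist_ok=True)

# Step 0: extract frame images from a video (cf. 0.video2pic.py).
cap = cv2.VideoCapture('recording.mp4')  # hypothetical file name
idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imwrite(f'frames/{idx:06d}.png', frame)
    idx += 1
cap.release()

# Step 2: overlay gaze points on the frames (cf. 2.PlotEyeTrackingOnImgs.py).
gaze = pd.read_csv('gaze.csv')  # hypothetical output of 1.ReadingTxtFile.py
for _, row in gaze.iterrows():
    img = cv2.imread(f"frames/{int(row['frame']):06d}.png")
    if img is None:
        continue
    h, w = img.shape[:2]
    # Unity's viewport origin is bottom-left while image pixels use a
    # top-left origin, hence the 1 - y flip (verify against the
    # coordinate-system notes linked above).
    x = int(row['gaze_x'] * w)
    y = int((1 - row['gaze_y']) * h)
    cv2.circle(img, (x, y), 20, (0, 0, 255), 3)  # red gaze marker
    cv2.imwrite(f"frames_gaze/{int(row['frame']):06d}.png", img)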

Citation

If you want to explore more details or find this repo useful, please cite our work: https://www.hindawi.com/journals/jat/2022/2750369/, https://arxiv.org/abs/2202.13468, and https://ascelibrary.org/doi/abs/10.1061/9780784483893.161.

  1. Guo, X., Angulo, A., Robartes, E., Chen, T. D., & Heydarian, A. (2022). ORCLSim: A system architecture for studying bicyclist and pedestrian physiological behavior through immersive virtual environments. Journal of Advanced Transportation, 2022, 2750369. DOI: 10.1155/2022/2750369.
  2. Guo, X., Robartes, E., Angulo, A., Chen, T. D., & Heydarian, A. (2021). Benchmarking the use of immersive virtual bike simulators for understanding cyclist behaviors. In Computing in Civil Engineering 2021 (pp. 1319-1326).
  3. Guo, X., Tavakoli, A., Robartes, E., Angulo, A., Chen, T. D., & Heydarian, A. (2022). Roadway Design Matters: Variation in Bicyclists' Psycho-Physiological Responses in Different Urban Roadway Designs. arXiv preprint arXiv:2202.13468.
@article{Guo2022,
  author={Guo, Xiang and Angulo, Austin and Robartes, Erin and Chen, T. Donna and Heydarian, Arsalan},
  title={ORCLSim: A System Architecture for Studying Bicyclist and Pedestrian Physiological Behavior through Immersive Virtual Environments},
  journal={Journal of Advanced Transportation},
  year={2022},
  month={Aug},
  day={04},
  publisher={Hindawi},
  volume={2022},
  pages={2750369},
  issn={0197-6729},
  doi={10.1155/2022/2750369},
  url={https://doi.org/10.1155/2022/2750369}
}

@incollection{guo2021benchmarking,
  title={Benchmarking the Use of Immersive Virtual Bike Simulators for Understanding Cyclist Behaviors},
  author={Guo, Xiang and Robartes, Erin and Angulo, Austin and Chen, T Donna and Heydarian, Arsalan},
  booktitle={Computing in Civil Engineering 2021},
  pages={1319--1326}
}

@article{guo2022roadway,
  title={Roadway Design Matters: Variation in Bicyclists' Psycho-Physiological Responses in Different Urban Roadway Designs},
  author={Guo, Xiang and Tavakoli, Arash and Robartes, Erin and Angulo, Austin and Chen, T Donna and Heydarian, Arsalan},
  journal={arXiv preprint arXiv:2202.13468},
  year={2022}
}


