This repository contains scripts for Task 1 of the HANDS2022 challenge organized at ECCV'22. The goal of this task is to estimate the pose of hand-held objects from a single RGB image. For more details:
- HANDS22 website:
- Challenge Website:
- Dataset Download:
The dataset used in the challenge is a refactored version of the HO-3D dataset. We provide scripts for the following in this repo:
- Dataset Visualization Script.
- Challenge Submission Script.
- Object Pose Accuracy Evaluation Script.
- Download the MANO model files from the MANO website (requires login) to `smplx_dir`.
- Download the dataset to `dataset_dir`.
- Download the YCB object models by clicking on "The YCB-Video 3D Models" here. Save the models in `object_models_dir`.
- Install the required modules (a quick sanity check of the setup follows this list):
```
pip install open3d matplotlib argparse opencv-python torch smplx pillow
```
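To verify that the MANO files are in place, you can try loading the hand model through the smplx package. This is a minimal sketch, assuming the downloaded files were extracted under `smplx_dir` in the layout smplx expects:

```python
# Sanity check: load the MANO hand layer from the downloaded model files.
# Assumes the MANO files live under smplx_dir in the layout smplx expects.
import smplx

mano = smplx.create('smplx_dir', model_type='mano', is_rhand=True, use_pca=True)
print(mano)  # prints a summary of the MANO layer if the files were found
```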
To visualize the annotations in the train set, run the following:
```
python vis.py --dataset_dir dataset_dir --smplx_path smplx_dir --object_models_dir object_models_dir
```
The 2D annotations and the segmentation map are shown in a matplotlib window, and the mesh is visualized in an Open3D window.
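For reference, the 2D keypoint overlays boil down to a standard pinhole projection of the annotated 3D points with the camera intrinsics. Below is a minimal sketch under that assumption; the intrinsic matrix and the 3D point are made-up values, and the actual annotation field names and coordinate conventions are defined by the dataset:

```python
import numpy as np

def project_points(points_3d, cam_mat):
    """Project Nx3 camera-frame points to Nx2 pixel coordinates."""
    proj = points_3d @ cam_mat.T       # apply the 3x3 intrinsic matrix
    return proj[:, :2] / proj[:, 2:3]  # perspective divide by depth

# Toy usage with a made-up intrinsic matrix and one 3D point (in meters).
K = np.array([[614.0, 0.0, 320.0],
              [0.0, 614.0, 240.0],
              [0.0, 0.0, 1.0]])
print(project_points(np.array([[0.1, 0.05, 0.5]]), K))
```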
To submit your results to the challenge for automatic evaluation, modify `challenge_submit.py` to call your pose-estimation function and dump the JSON file. The JSON file has to be zipped before submission.
```
python challenge_submit.py --dataset_dir dataset_dir
```
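As a rough outline of the dump-and-zip step, the sketch below writes one pose estimate per test sample to JSON and zips the result. `estimate_pose`, the loop, the file names, and the JSON fields are hypothetical placeholders; the required output format is defined in `challenge_submit.py`:

```python
import json
import zipfile

import numpy as np

def estimate_pose(sample_id):
    """Hypothetical stand-in for your own pose estimator."""
    return np.eye(3), np.zeros(3)  # rotation, translation

results = []
for sample_id in range(5):                    # stand-in for the test split
    rot, trans = estimate_pose(sample_id)
    results.append({'rot': rot.tolist(), 'trans': trans.tolist()})

with open('pred.json', 'w') as f:
    json.dump(results, f)

with zipfile.ZipFile('pred.zip', 'w') as zf:  # submission must be a zip
    zf.write('pred.json')
```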
We use the Maximum Symmetry-aware Surface Distance (MSSD) metric to measure the accuracy of the object pose estimates. Refer to the challenge website for a more detailed description of the metric. The script `evaluation.py` is used on the CodaLab competition server to evaluate the submissions. The annotations of the test set are withheld.
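For intuition, MSSD as defined in the BOP benchmark is the maximum distance between corresponding model vertices under the estimated and ground-truth poses, minimized over the object's symmetry transforms. A minimal NumPy sketch follows; the pose and symmetry representations here are assumptions for illustration, not the exact interface of `evaluation.py`:

```python
import numpy as np

def mssd(r_est, t_est, r_gt, t_gt, pts, syms):
    """Maximum Symmetry-aware Surface Distance.

    r_*: 3x3 rotations, t_*: translation 3-vectors, pts: Nx3 model
    vertices, syms: list of 4x4 symmetry transforms (just the identity
    for objects without discrete symmetries).
    """
    est = pts @ r_est.T + t_est                   # vertices under estimate
    errs = []
    for s in syms:
        pts_s = pts @ s[:3, :3].T + s[:3, 3]      # apply one symmetry
        gt = pts_s @ r_gt.T + t_gt                # vertices under GT pose
        errs.append(np.linalg.norm(est - gt, axis=1).max())
    return min(errs)                              # best over symmetries
```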