AUW-GCN-for-ME-Spotting

PyTorch implementation for the paper "AU-aware graph convolutional network for Macro- and Micro-expression spotting" (ICME-2023, Poster): IEEE version, arXiv version.

(Figure: model overview)

The code is modified from USTC_ME_Spotting.

Results

We compare our method against others on two benchmark datasets, i.e., CAS(ME)^2 and SAMM-LV, in terms of F1-score:

(Figure: F1-score comparison with prior methods on CAS(ME)^2 and SAMM-LV)

Experiment environment

OS: Ubuntu 20.04.4 LTS

Python: 3.8

Pytorch: 1.10.1

CUDA: 10.2, cudnn: 7.6.5

GPU: NVIDIA GeForce RTX 2080 Ti
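
As an optional sanity check (this snippet is not part of the repo), the following prints the installed versions so you can verify they match the setup above:

import torch

print(torch.__version__)               # expect 1.10.1
print(torch.version.cuda)              # expect 10.2
print(torch.cuda.is_available())       # should be True on a CUDA-capable GPU
print(torch.backends.cudnn.version())  # expect 7605 for cudnn 7.6.5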

Getting started

  1. Clone this repository
$ git clone git@github.com:xjtupanda/AUW-GCN.git
$ cd AUW-GCN
  2. Prepare environment
$ conda create -n env_name python=3.8
$ conda activate env_name
$ pip install -r requirements.txt
  3. Download features

For the features of the SAMM-LV and CAS(ME)^2 datasets, please download features.tar.gz (modified from USTC_ME_Spotting#features-and-config-file) and extract it:

$ tar -xf features.tar.gz -C dir_to_save_feature

After downloading the feature files, update the feature-path variable segment_feat_root in config.yaml accordingly.
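
A quick, optional way to confirm the path is set correctly (a minimal sketch, assuming PyYAML is installed and segment_feat_root is a top-level key in config.yaml; this script is not part of the repo):

import os
import yaml

with open("config.yaml") as f:
    cfg = yaml.safe_load(f)

# Adjust the lookup if the key is nested in your config.yaml.
feat_root = cfg["segment_feat_root"]
assert os.path.isdir(feat_root), f"feature directory not found: {feat_root}"
print("segment_feat_root ->", feat_root)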

  4. Training and Inference

Set SUB_LIST, OUTPUT (the directory for saving checkpoints, logs and results) and DATASET (["samm" | "cas(me)^2"]) in pipeline.sh, then run:

$ bash pipeline.sh

We also provide checkpoints, logs, etc. for reproducing the results in the paper; please download ckpt.tar.gz.

Designing your own adjacency matrix (Optional)

Check make_coc_matrix.py.
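
For intuition, the sketch below builds a co-occurrence-style adjacency matrix from binary AU annotations. The function name, input format and normalization are illustrative assumptions, not the exact scheme used in make_coc_matrix.py:

import numpy as np

def cooccurrence_adjacency(au_labels: np.ndarray) -> np.ndarray:
    """au_labels: (num_samples, num_aus) binary AU activation matrix."""
    counts = au_labels.T @ au_labels                # (num_aus, num_aus) co-occurrence counts
    occ = np.clip(au_labels.sum(axis=0), 1, None)   # per-AU occurrence, avoid divide-by-zero
    adj = counts / occ[:, None]                     # P(AU_j | AU_i)-style row normalization
    np.fill_diagonal(adj, 1.0)                      # self-loops
    return adj

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    labels = (rng.random((100, 12)) > 0.7).astype(np.float32)  # toy AU annotations
    print(cooccurrence_adjacency(labels).shape)     # (12, 12)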

Feature Extraction (Optional)

This part of the code is in ./feature_extraction.

  1. Download the model checkpoints checkpoint.zip, extract the archive to the feature_extraction dir, and move feature_extraction/checkpoint/Resnet50_Final.pth to the feature_extraction/retinaface dir (see the sketch after this list)
  2. Set path and other settings in config.yaml
  3. Run new_all.py
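
For step 1, a minimal Python sketch of the required extraction and move (the paths follow the description above; a plain unzip and mv work just as well):

import shutil
import zipfile

# Extract checkpoint.zip into the feature_extraction directory.
with zipfile.ZipFile("checkpoint.zip") as z:
    z.extractall("feature_extraction")

# Move the RetinaFace weights to where the extraction code expects them.
shutil.move("feature_extraction/checkpoint/Resnet50_Final.pth",
            "feature_extraction/retinaface/Resnet50_Final.pth")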

Special credit to whcold, as this part of the code was mainly written by him.

Citation

If you find this project helpful to your research, please cite our work:

@inproceedings{yin2023aware,
  title={AU-aware graph convolutional network for Macro- and Micro-expression spotting},
  author={Yin, Shukang and Wu, Shiwei and Xu, Tong and Liu, Shifeng and Zhao, Sirui and Chen, Enhong},
  booktitle={2023 IEEE International Conference on Multimedia and Expo (ICME)},
  pages={228--233},
  year={2023},
  organization={IEEE}
}

You may open an issue or email me at [email protected] if you have any questions or issues.
