GAN-based Garment Generation Using Sewing Pattern Images
Yu Shen, Junbang Liang, Ming C. Lin
University of Maryland, College Park
In ECCV 2020.

We propose a garment generation model that supports most garment topologies and patterns, human body shapes and sizes, and garment materials.
- Our results
- Linux
- Python 2 or 3
- NVIDIA GPU (11G memory or larger) + CUDA cuDNN
- Python packages: dominate, geomloss, OpenCV, PyTorch
- C++ dependencies: OpenCV, CMake
- We publish a garment dataset here (122G).
Folder name format: C[ID1]M[ID2]H[ID3], where ID1 is the 3-digit cloth id, ID2 is the 2-digit material id, and ID3 is the 2-digit human body sequence id, e.g., C050M00H03.
In each folder, [ID4]_00.obj is a garment model file, where ID4 is the garment model id within the sequence, while obs[ID5]_00.obj is a human model file, where ID5 is the human model id within the sequence. In addition, gt.pkl contains the original human pose and shape information; we provide a Python script to extract it in the next step.
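As a convenience, the sequence-folder naming convention above can be decoded with a small helper. This is a hypothetical sketch (`parse_sequence_folder` is not part of the released code), following the C[ID1]M[ID2]H[ID3] format exactly as described:

```python
import re

# Matches the C[ID1]M[ID2]H[ID3] naming convention, e.g. "C050M00H03":
# 3-digit cloth id, 2-digit material id, 2-digit human body sequence id.
FOLDER_RE = re.compile(r"^C(\d{3})M(\d{2})H(\d{2})$")

def parse_sequence_folder(name):
    """Return (cloth_id, material_id, human_id) as ints, or None if malformed."""
    m = FOLDER_RE.match(name)
    if m is None:
        return None
    return tuple(int(g) for g in m.groups())

print(parse_sequence_folder("C050M00H03"))  # (50, 0, 3)
```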
The raw dataset (including sewing patterns) we used came from multiple sources, some of which are publicly available:
- Dataset from the Research Group led by Prof. Huamin Wang (Ohio State University): here
- Dataset from the Research Group led by Prof. Alla Sheffer (University of British Columbia): here, here
python extract_pose_shape.py [dataset path] [output path]
Sample:
python extract_pose_shape.py all_data/garment_dataset all_data/train_data
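If you want to inspect a gt.pkl file directly, it can be loaded with Python's standard pickle module. This is only a sketch: the exact dictionary layout is defined by extract_pose_shape.py, so here we just load the file and let you inspect its contents.

```python
import pickle

def load_ground_truth(path):
    """Load a gt.pkl file (original human pose and shape information)."""
    with open(path, "rb") as f:
        return pickle.load(f)

# Example (assumes the dataset has been downloaded; path is illustrative):
# gt = load_ground_truth("all_data/garment_dataset/C050M00H03/gt.pkl")
# print(type(gt))
```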
Compile reconstruction/new_enc.cpp with the provided CMakeLists.txt:
cd reconstruction
mkdir build
cd build
cmake ../
make
You will get an executable file "ProcessMesh". Then run the batch script (remember to modify the folder path in the script first):
cd ../../script
bash batch_multithread.sh
Check the options files before training:
- options/base_options.py (e.g., --name, --dataroot)
- options/train_options.py
- options/test_options.py
Note that the files should be aligned across the folders train_pose, train_shape, train_legal_map, and train_displacement_map.
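A quick way to confirm the alignment requirement above is to compare basenames across the four folders. This is a hypothetical sanity check, not part of the repo; it assumes samples are matched by filename (without extension):

```python
import os

# The four training folders that must contain matching samples.
FOLDERS = ["train_pose", "train_shape", "train_legal_map", "train_displacement_map"]

def check_alignment(root, folders=FOLDERS):
    """Return the set of basenames common to all folders; report any extras."""
    name_sets = []
    for sub in folders:
        names = {os.path.splitext(f)[0] for f in os.listdir(os.path.join(root, sub))}
        name_sets.append(names)
    common = set.intersection(*name_sets)
    for sub, names in zip(folders, name_sets):
        extras = names - common
        if extras:
            print("misaligned entries in %s: %s" % (sub, sorted(extras)))
    return common
```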
python train.py
After training, see checkpoints in ./checkpoints/[name]
Set test_legal_map, test_pose, and test_shape. Keep reconstruct_dp_map.
python test.py
First create a reconstruct_mesh folder under your target output folder. Then:
cd reconstruction
mkdir build
cd build
cmake ../
make
./ProcessMesh 1 ../../results/[name]/test_latest/images/ [training data path]/reconstruct_dp_map/ [training data path]/test_legal_map/ [output path] 0 7
Sample:
./ProcessMesh 1 ../../results/garments/test_latest/images/ ../../all_data/train_data/reconstruct_dp_map/ ../../all_data/train_data/test_legal_map/ ../../all_data/train_data/ 0 7
Then you can find the new garments in [output path]/reconstruct_mesh folder.
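To sanity-check the reconstructed garments, you can count the vertices and faces of an output mesh. This sketch assumes the outputs are standard Wavefront .obj files, like the garment models elsewhere in the dataset:

```python
def obj_stats(path):
    """Return (vertex_count, face_count) for a Wavefront .obj mesh."""
    n_v = n_f = 0
    with open(path) as f:
        for line in f:
            if line.startswith("v "):   # geometric vertex line
                n_v += 1
            elif line.startswith("f "): # face line
                n_f += 1
    return n_v, n_f

# Example (path is illustrative):
# print(obj_stats("all_data/train_data/reconstruct_mesh/0_00.obj"))
```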
If you find this useful for your research, please cite the following.
@inproceedings{shen2020garmentgeneration,
title={GAN-based Garment Generation Using Sewing Pattern Images},
author={Yu Shen and Junbang Liang and Ming C. Lin},
booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
year={2020}
}
We would like to thank the Research Groups led by Prof. Eitan Grinspun (Columbia University), Prof. Alla Sheffer (University of British Columbia), and Prof. Huamin Wang (Ohio State University) for sharing their design patterns datasets for the benchmarking and demonstration in this paper.
This code borrows heavily from NVIDIA/pix2pixHD. Junbang Liang also made great contributions to this project.