ac5113 committed
Commit 0853d1a • 1 Parent(s): 99a05f0

updated readme

Files changed (1): README.md (+35 -136)
README.md CHANGED
@@ -1,125 +1,38 @@
- # DECO: Dense Estimation of 3D Human-Scene Contact in the Wild [ICCV 2023 (Oral)]
-
- > Code repository for the paper:
- > [**DECO: Dense Estimation of 3D Human-Scene Contact in the Wild**](https://openaccess.thecvf.com/content/ICCV2023/html/Tripathi_DECO_Dense_Estimation_of_3D_Human-Scene_Contact_In_The_Wild_ICCV_2023_paper.html)
- > [Shashank Tripathi](https://sha2nkt.github.io/), [Agniv Chatterjee](https://ac5113.github.io/), [Jean-Claude Passy](https://is.mpg.de/person/jpassy), [Hongwei Yi](https://xyyhw.top/), [Dimitrios Tzionas](https://ps.is.mpg.de/person/dtzionas), [Michael J. Black](https://ps.is.mpg.de/person/black)<br />
- > *IEEE International Conference on Computer Vision (ICCV), 2023*
-
- [![arXiv](https://img.shields.io/badge/arXiv-2309.15273-00ff00.svg)](https://arxiv.org/abs/2309.15273) [![Website shields.io](https://img.shields.io/website-up-down-green-red/http/shields.io.svg)](https://deco.is.tue.mpg.de/) [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)]() [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)]()
-
- ![teaser](assets/teaser.png)
-
- [[Project Page](https://deco.is.tue.mpg.de)] [[Paper](https://arxiv.org/abs/2309.15273)] [[Video](https://www.youtube.com/watch?v=o7MLobqAFTQ)] [[Poster](https://www.dropbox.com/scl/fi/kvhpfnkvga2pt19ayko8u/ICCV2023_DECO_Poster_v2.pptx?rlkey=ihbf3fi6u9j0ha9x1gfk2cwd0&dl=0)] [[License](https://deco.is.tue.mpg.de/license.html)] [[Contact](mailto:deco@tue.mpg.de)]
-
- ## Installation and Setup
- 1. First, clone the repo. Then, we recommend creating a clean [conda](https://docs.conda.io/) environment, activating it, and installing torch and torchvision as follows:
- ```shell
- git clone https://github.com/sha2nkt/deco.git
- cd deco
- conda create -n deco python=3.9 -y
- conda activate deco
- pip install torch==1.13.0+cu117 torchvision==0.14.0+cu117 --extra-index-url https://download.pytorch.org/whl/cu117
- ```
- Please adjust the CUDA version as required.
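- For example, if your driver supports CUDA 11.6 rather than 11.7, you could install the matching wheels instead (a sketch; the ```cu116``` wheel index is our assumption, not part of the original instructions):
- ```shell
- # Check which CUDA version the installed driver supports
- nvidia-smi
- # Install wheels built against CUDA 11.6 instead of 11.7
- pip install torch==1.13.0+cu116 torchvision==0.14.0+cu116 --extra-index-url https://download.pytorch.org/whl/cu116
- ```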
-
- 2. Install PyTorch3D. You may refer to [PyTorch3D-install](https://github.com/facebookresearch/pytorch3d/blob/main/INSTALL.md) for more details; however, our tests show that installing with ``conda`` sometimes runs into dependency conflicts, so we recommend building PyTorch3D from source as follows:
- ```shell
- git clone https://github.com/facebookresearch/pytorch3d.git
- cd pytorch3d
- pip install .
- cd ..
- ```
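- A quick way to verify the build succeeded (a minimal sketch; it simply imports the package and prints its version):
- ```shell
- python -c "import pytorch3d; print(pytorch3d.__version__)"
- ```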
-
- 3. Install the other dependencies and download the required data:
- ```bash
- pip install -r requirements.txt
- sh fetch_data.sh
- ```
-
- 4. Please download the [SMPL](https://smpl.is.tue.mpg.de/) (version 1.1.0) and [SMPL-X](https://smpl-x.is.tue.mpg.de/) (v1.1) files into the ```data``` folder, and rename the SMPL files to ```SMPL_FEMALE.pkl```, ```SMPL_MALE.pkl``` and ```SMPL_NEUTRAL.pkl```. The directory structure for the ```data``` folder is shown below:
-
- ```
- ├── preprocess
- ├── smpl
- │   ├── SMPL_FEMALE.pkl
- │   ├── SMPL_MALE.pkl
- │   ├── SMPL_NEUTRAL.pkl
- │   ├── smpl_neutral_geodesic_dist.npy
- │   ├── smpl_neutral_tpose.ply
- │   ├── smplpix_vertex_colors.npy
- ├── smplx
- │   ├── SMPLX_FEMALE.npz
- │   ├── SMPLX_FEMALE.pkl
- │   ├── SMPLX_MALE.npz
- │   ├── SMPLX_MALE.pkl
- │   ├── SMPLX_NEUTRAL.npz
- │   ├── SMPLX_NEUTRAL.pkl
- │   ├── smplx_neutral_tpose.ply
- ├── weights
- │   ├── pose_hrnet_w32_256x192.pth
- ├── J_regressor_extra.npy
- ├── base_dataset.py
- ├── mixed_dataset.py
- ├── smpl_partSegmentation_mapping.pkl
- ├── smpl_vert_segmentation.json
- └── smplx_vert_segmentation.json
- ```
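- As a quick sanity check (a sketch assuming the layout above), confirm the body-model files landed in the right folders:
- ```shell
- ls data/smpl data/smplx data/weights
- ```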
-
- ## Run demo on images
- The following command runs DECO on all images in the specified `--img_src` and saves the renderings and colored meshes in `--out_dir`. The `--model_path` flag specifies the checkpoint to use. Additionally, the base mesh color and the color of the predicted contact annotations can be set with the `--mesh_colour` and `--annot_colour` flags, respectively.
- ```bash
- python inference.py \
-     --img_src example_images \
-     --out_dir demo_out
- ```
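- A fuller invocation might look like the sketch below; the checkpoint path and colour values are illustrative assumptions, not defaults shipped with the repository:
- ```bash
- # --model_path below is a hypothetical filename; point it at your downloaded checkpoint.
- # The colour value format is assumed; check `python inference.py --help` for the expected form.
- python inference.py \
-     --img_src example_images \
-     --out_dir demo_out \
-     --model_path checkpoints/deco_best.pth \
-     --mesh_colour "0.7 0.7 0.7" \
-     --annot_colour "0.0 1.0 0.0"
- ```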
-
- ## Training and Evaluation
-
- We release 3 versions of the DECO model:
- 1. DECO-HRNet (*best-performing model*)
- 2. DECO-HRNet w/o context branches
- 3. DECO-Swin
-
- All the checkpoints are downloaded to ```checkpoints``` during setup. However, please note that versions 2 and 3 have been trained solely on the RICH dataset, so we recommend using the first DECO version.
-
- The dataset npz files are downloaded to ```datasets/Release_Datasets```. Please download the actual DAMON data and place them in ```datasets``` following the instructions given.
-
- ### Evaluation
- To run evaluation on the DAMON dataset, please run the following command:
-
- ```bash
- python tester.py --cfg configs/cfg_test.yml
- ```
-
- ### Training
- The provided config (```cfg_train.yml```) is set to train and evaluate on all three datasets: DAMON, RICH and PROX. To change this, change the values of the keys ```TRAINING.DATASETS``` and ```VALIDATION.DATASETS``` in the config, and adjust ```TRAINING.DATASET_MIX_PDF``` as required (see the sketch below the command). The best checkpoint is stored by default at ```checkpoints/Other_Checkpoints```. Run the following command to start training the DECO model:
-
- ```bash
- python train.py --cfg configs/cfg_train.yml
- ```
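- As an illustration, here is a hypothetical fragment of ```cfg_train.yml``` that restricts training and validation to DAMON only. The nesting and value format are assumptions inferred from the dotted key names above, so mirror the layout of the shipped config:
- ```yaml
- TRAINING:
-   DATASETS: ['damon']       # assumed dataset identifier
-   DATASET_MIX_PDF: [1.0]    # sampling weights should cover the listed datasets
- VALIDATION:
-   DATASETS: ['damon']       # assumed dataset identifier
- ```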
-
- ### Training on custom datasets
-
- To train on other datasets, please follow these steps:
- 1. Create an npz of the dataset, following the structure of the datasets in ```datasets/Release_Datasets``` with the corresponding keys and values (see the sketch after this list).
- 2. Create scene segmentation maps, if not available. We have used [Mask2Former](https://github.com/facebookresearch/Mask2Former) in our work.
- 3. To create the part segmentation maps, refer to this [sample script](https://github.com/sha2nkt/deco/blob/main/scripts/datascripts/get_part_seg_mask.py).
- 4. Add the dataset name(s) to ```train.py``` ([these lines](https://github.com/sha2nkt/deco/blob/d5233ecfad1f51b71a50a78c0751420067e82c02/train.py#L83)), ```tester.py``` ([these lines](https://github.com/sha2nkt/deco/blob/d5233ecfad1f51b71a50a78c0751420067e82c02/tester.py#L51)) and ```data/mixed_dataset.py``` ([these lines](https://github.com/sha2nkt/deco/blob/d5233ecfad1f51b71a50a78c0751420067e82c02/data/mixed_dataset.py#L17)), according to the body model being used (SMPL/SMPL-X).
- 5. Add the path(s) to the dataset npz(s) to ```common/constants.py``` ([these lines](https://github.com/sha2nkt/deco/blob/d5233ecfad1f51b71a50a78c0751420067e82c02/common/constants.py#L19)).
- 6. Finally, change ```TRAINING.DATASETS``` and ```VALIDATION.DATASETS``` in the config file and you're good to go!
-
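- As a starting point for step 1, here is a minimal sketch that inspects a released npz to discover the expected keys; the paths and key names are illustrative, not a documented schema:
- ```python
- import glob
- import numpy as np
-
- # Look at any released dataset file to learn the expected structure
- ref_path = sorted(glob.glob('datasets/Release_Datasets/**/*.npz', recursive=True))[0]
- ref = np.load(ref_path, allow_pickle=True)
- print(ref_path, sorted(ref.files))
-
- # Then pack your own dataset with the same keys, e.g.:
- # np.savez('datasets/my_dataset.npz', **{k: ... for k in ref.files})
- ```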
- ## Citing
- If you find this code useful for your research, please consider citing the following paper:
-
- ```bibtex
  @InProceedings{Tripathi_2023_ICCV,
  author = {Tripathi, Shashank and Chatterjee, Agniv and Passy, Jean-Claude and Yi, Hongwei and Tzionas, Dimitrios and Black, Michael J.},
  title = {DECO: Dense Estimation of 3D Human-Scene Contact In The Wild},
@@ -128,18 +41,4 @@ If you find this code useful for your research, please consider citing the follo
  year = {2023},
  pages = {8001-8013}
  }
- ```
-
- ### License
-
- See [LICENSE](LICENSE).
-
- ### Acknowledgments
-
- We sincerely thank Alpar Cseke for his contributions to DAMON data collection and PHOSA evaluations, Sai K. Dwivedi for facilitating PROX downstream experiments, Xianghui Xie for his generous help with CHORE evaluations, Lea Muller for her help in initiating the contact annotation tool, Chun-Hao P. Huang for RICH discussions and Yixin Chen for details about the HOT paper. We are grateful to Mengqin Xue and Zhenyu Lou for their collaboration in BEHAVE evaluations, Joachim Tesch and Nikos Athanasiou for insightful visualization advice, and Tsvetelina Alexiadis for valuable data collection guidance. Their invaluable contributions enriched this research significantly. We also thank Benjamin Pellkofer for help with the website and IT support. This work was funded by the International Max Planck Research School for Intelligent Systems (IMPRS-IS).
-
- ### Contact
-
- For technical questions, please create an issue. For other questions, please contact `deco@tue.mpg.de`.
-
- For commercial licensing, please contact `ps-licensing@tue.mpg.de`.
 
+ ---
+ title: "DECO: Dense Estimation of 3D Human-Scene Contact in the Wild"
+ metaTitle: DECO
+ emoji: 🀼
+ colorFrom: green
+ colorTo: pink
+ sdk: gradio
+ sdk_version: 3.27.0
+ app_file: app.py
+ pinned: true
+ python_version: 3.9
+ ---
+
+ ### DECO: Dense Estimation of 3D Human-Scene Contact in the Wild (ICCV 2023, Oral)
+
+ <table>
+ <th width="20%">
+ <ul>
+ <li><strong>Homepage</strong> <a href="https://deco.is.tue.mpg.de/">deco.is.tue.mpg.de</a></li>
+ <li><strong>Code</strong> <a href="https://github.com/sha2nkt/deco">sha2nkt/deco</a></li>
+ <li><strong>Paper</strong> <a href="https://arxiv.org/abs/2309.15273">arXiv</a></li>
+ </ul>
+ <br>
+ <ul>
+ <li><strong>Colab Notebook</strong> <a href=''><img style="display: inline-block;" src='https://colab.research.google.com/assets/colab-badge.svg' alt='Google Colab'></a></li>
+ </ul>
+ <br>
+ </th>
+ <th width="40%">
+ <iframe width="560" height="315" src="https://www.youtube.com/embed/o7MLobqAFTQ" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
+ </th>
+ </table>
+
+ #### Citation
+ ```
  @InProceedings{Tripathi_2023_ICCV,
  author = {Tripathi, Shashank and Chatterjee, Agniv and Passy, Jean-Claude and Yi, Hongwei and Tzionas, Dimitrios and Black, Michael J.},
  title = {DECO: Dense Estimation of 3D Human-Scene Contact In The Wild},

  year = {2023},
  pages = {8001-8013}
  }
+ ```