---
license: openrail
---
Pretrained models of our method **MultiAugs**

Title: *Boosting Semi-Supervised 2D Human Pose Estimation by Revisiting Data Augmentation and Consistency Training*

Paper link: https://arxiv.org/abs/2402.11566

Code link: https://github.com/hnuzhy/MultiAugs

**COCO1K / COCO5K / COCO10K**

*trained on a `partly labeled (1K, 5K, or 10K) COCO train-set` and the `remaining unlabeled COCO train-set`*
* **ResNet-18 (Pose_Cons using single network)(256x192, COCO1K, 30 epochs)**: [pose_cons_18-COCO1K_e30-model_best.pth.tar](./pose_cons_18-COCO1K_e30-model_best.pth.tar)
* **ResNet-18 (Pose_Cons using single network)(256x192, COCO5K, 70 epochs)**: [pose_cons_18-COCO5K_e70-model_best.pth.tar](./pose_cons_18-COCO5K_e70-model_best.pth.tar)
* **ResNet-18 (Pose_Cons using single network)(256x192, COCO10K, 100 epochs)**: [pose_cons_18-COCO10K_e100-model_best.pth.tar](./pose_cons_18-COCO10K_e100-model_best.pth.tar)
* **ResNet-18 (Pose_Dual using dual networks)(256x192, COCO1K, 30 epochs)**: [pose_dual_18-COCO1K_e30-model_best.pth.tar](./pose_dual_18-COCO1K_e30-model_best.pth.tar)
* **ResNet-18 (Pose_Dual using dual networks)(256x192, COCO5K, 70 epochs)**: [pose_dual_18-COCO5K_e70-model_best.pth.tar](./pose_dual_18-COCO5K_e70-model_best.pth.tar)
* **ResNet-18 (Pose_Dual using dual networks)(256x192, COCO10K, 100 epochs)**: [pose_dual_18-COCO10K_e100-model_best.pth.tar](./pose_dual_18-COCO10K_e100-model_best.pth.tar)


**COCOall + COCOunlabel**

*trained on the `full labeled COCO train-set` and the `COCO unlabeled-set`*
* **ResNet-50 (Pose_Cons) (256x192, 400 epochs)**: [pose_cons_50-COCO_COCOunlabel_e400-model_best.pth.tar](./pose_cons_50-COCO_COCOunlabel_e400-model_best.pth.tar)
* **ResNet-101 (Pose_Cons) (256x192, 400 epochs)**: [pose_cons_101-COCO_COCOunlabel_e400-model_best.pth.tar](./pose_cons_101-COCO_COCOunlabel_e400-model_best.pth.tar)
* **HRNet-w48 (Pose_Cons) (384x288, 300 epochs)**: [pose_cons_w48-COCO_COCOunlabel_e300-model_best.pth.tar](./pose_cons_w48-COCO_COCOunlabel_e300-model_best.pth.tar)
* **ResNet-50 (Pose_Dual) (256x192, 400 epochs)**: [pose_dual_50-COCO_COCOunlabel_e400-model_best.pth.tar](./pose_dual_50-COCO_COCOunlabel_e400-model_best.pth.tar)
* **ResNet-101 (Pose_Dual) (256x192, 400 epochs)**: [pose_dual_101-COCO_COCOunlabel_e400-model_best.pth.tar](./pose_dual_101-COCO_COCOunlabel_e400-model_best.pth.tar)
* **HRNet-w48 (Pose_Dual) (384x288, 300 epochs)**: [pose_dual_w48-COCO_COCOunlabel_e300-model_best.pth.tar](./pose_dual_w48-COCO_COCOunlabel_e300-model_best.pth.tar)


**MPII + AIC**

*trained on `labeled MPII train-set` and `unlabeled AIC train-set`*
* **HRNet-w32 (Pose_Dual) (256x256, 400 epochs)** [*We are sorry that this model cannot be released due to company copyright restrictions.*]
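
All released checkpoints are standard PyTorch `.pth.tar` files. A minimal loading sketch follows; the `state_dict` wrapper key is an assumption based on common PyTorch training checkpoints, not something confirmed by the paper, so check the official code repository for the exact layout:

```python
import torch


def load_multiaugs_state_dict(path: str):
    """Load a checkpoint file and return its weight dict on CPU.

    Hypothetical helper: the 'state_dict' key is an assumption based on
    typical PyTorch training checkpoints; adjust to match the MultiAugs
    repository if the layout differs.
    """
    ckpt = torch.load(path, map_location="cpu")
    # Training checkpoints often wrap the weights alongside optimizer
    # state and epoch counters; unwrap if that wrapper is present.
    if isinstance(ckpt, dict) and "state_dict" in ckpt:
        return ckpt["state_dict"]
    return ckpt
```

The returned dict can then be passed to `model.load_state_dict(...)` for the matching backbone (e.g. ResNet-18 at 256x192 input for the COCO1K model).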