---
license: mit
---

Pretrained models of our method **DirectMHP**

Title: *DirectMHP: Direct 2D Multi-Person Head Pose Estimation with Full-range Angles*

Paper link: https://arxiv.org/abs/2302.01110

Code link: https://github.com/hnuzhy/DirectMHP

# Multi-Person Head Pose Estimation Task (trained on CMU-HPE)
* DirectMHP-S --> [cmu_s_1280_e200_t40_lw010_best.pt](./cmu_s_1280_e200_t40_lw010_best.pt)
* DirectMHP-M --> [cmu_m_1280_e200_t40_lw010_best.pt](./cmu_m_1280_e200_t40_lw010_best.pt)

# Multi-Person Head Pose Estimation Task (trained on AGORA-HPE)
* DirectMHP-S --> [agora_s_1280_e300_t40_lw010_best.pt](./agora_s_1280_e300_t40_lw010_best.pt)
* DirectMHP-M --> [agora_m_1280_e300_t40_lw010_best.pt](./agora_m_1280_e300_t40_lw010_best.pt)
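
The `*.pt` checkpoints above (and the single-person ones further below) come from the YOLOv5-based DirectMHP training code, so they are expected to be YOLOv5-style checkpoint dicts carrying the full pickled model object. The sketch below is only a minimal loading/inspection example under that assumption; the file name and the `model`/`ema` keys are assumptions, loading requires the DirectMHP/YOLOv5 code on the Python path, and actual inference should go through the scripts in the DirectMHP repository.

```python
import torch

# A minimal loading sketch, assuming YOLOv5-style checkpoints (a dict that
# stores the full model object under a 'model' or 'ema' key). Adjust the
# file name to the checkpoint you downloaded.
ckpt_path = "cmu_m_1280_e200_t40_lw010_best.pt"
ckpt = torch.load(ckpt_path, map_location="cpu")  # recent PyTorch may need weights_only=False here

if isinstance(ckpt, dict):
    print("checkpoint keys:", list(ckpt.keys()))
    model = ckpt.get("ema") or ckpt.get("model")  # full nn.Module in YOLOv5-style saves
else:
    model = ckpt  # some exports save the bare model object directly

if model is not None:
    model = model.float().eval()
    print(type(model).__name__)
```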

# Single HPE datasets with YOLOv5+COCO format
* Re-organized images used in our DirectMHP: [300W-LP.zip](./300W_LP.zip), [AFLW2000.zip](./AFLW2000.zip) and [BIWI_test.zip](./BIWI_test.zip).
* Corresponding re-organized JSON annotation files: [train_300W_LP.json](./train_300W_LP.json), [val_AFLW2000.json](./val_AFLW2000.json) and [BIWI_test.json](./BIWI_test.json).
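
Since the annotations follow the COCO JSON layout, they can be inspected with plain `json` before plugging them into the training pipeline. The sketch below is a quick peek only; the chosen file name comes from the list above, and the exact head-pose field names inside each annotation record are an assumption, not a specification.

```python
import json

# A minimal sketch: peek at one COCO-style annotation file from this repo.
with open("val_AFLW2000.json") as f:
    coco = json.load(f)

print("top-level keys:", list(coco.keys()))           # typically: images, annotations, categories
print("num images:", len(coco.get("images", [])))
print("num annotations:", len(coco.get("annotations", [])))

# Show one annotation record; standard COCO detection entries carry
# image_id, category_id and bbox, and DirectMHP adds head-pose labels
# on top of this format (exact extra field names are an assumption here).
if coco.get("annotations"):
    print(coco["annotations"][0])
```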

# Single HPE Task Pretrained on WiderFace and Finetuned on 300W-LP
* DirectMHP-S --> [300wlp_s_512_e50_finetune_best.pt](./300wlp_s_512_e50_finetune_best.pt)
* DirectMHP-M --> [300wlp_m_512_e50_finetune_best.pt](./300wlp_m_512_e50_finetune_best.pt)

# Single HPE SixDRepNet Re-trained on AGORA-HPE and CMU-HPE
* AGORA-HPE --> [SixDRepNet_AGORA_bs256_e100_epoch_last.pth](./SixDRepNet_AGORA_bs256_e100_epoch_last.pth)
* CMU-HPE --> [SixDRepNet_CMU_bs256_e100_epoch_last.pth](./SixDRepNet_CMU_bs256_e100_epoch_last.pth)
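
These `*.pth` files are assumed to be plain PyTorch checkpoints (either a bare state dict or a dict wrapping one) saved by the SixDRepNet training code. The sketch below only loads and inspects the stored tensors; the `model_state_dict` wrapping key is an assumption, and running the model requires the SixDRepNet class from its original repository plus `load_state_dict`.

```python
import torch

# A minimal inspection sketch for the re-trained SixDRepNet checkpoints.
ckpt = torch.load("SixDRepNet_CMU_bs256_e100_epoch_last.pth", map_location="cpu")

# Unwrap a possible checkpoint dict; fall back to treating it as a state dict.
state_dict = ckpt.get("model_state_dict", ckpt) if isinstance(ckpt, dict) else ckpt

print("num entries:", len(state_dict))
for name, value in list(state_dict.items())[:5]:
    shape = tuple(value.shape) if hasattr(value, "shape") else type(value).__name__
    print(name, shape)
```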