---
language: en
tags:
- pythae
- reproducibility
license: apache-2.0
---

### Downloading this model from the Hub
This model was trained with pythae. It can be downloaded or reloaded using the method `load_from_hf_hub`:
```python
>>> from pythae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="clementchadebec/reproduced_aae")
```
## Reproducibility
This trained model reproduces the results of Table 1 in [1].

| Model | Dataset | Metric | Obtained value | Reference value |
|:---:|:---:|:---:|:---:|:---:|
| AAE | CELEBA 64 | FID | 43.3 | 42 |
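
As a quick sanity check, samples can be drawn from the reloaded model with one of pythae's samplers, e.g. the `NormalSampler`. The snippet below is a minimal sketch only; the sampler choice and the sample count are illustrative and not necessarily the exact setup used to compute the FID score reported above.

```python
>>> from pythae.models import AutoModel
>>> from pythae.samplers import NormalSampler
>>> # reload the trained AAE from the Hub
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="clementchadebec/reproduced_aae")
>>> # draw samples from a standard normal prior in latent space (illustrative settings)
>>> sampler = NormalSampler(model)
>>> gen_data = sampler.sample(num_samples=25, batch_size=25, return_gen=True)
```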

[1] I. Tolstikhin, O. Bousquet, S. Gelly, and B. Schölkopf. Wasserstein auto-encoders. In 6th International Conference on Learning Representations (ICLR 2018), 2018.