---
language: en
tags:
- pythae
- reproducibility
license: apache-2.0
---

## Downloading this model from the Hub
This model was trained with pythae. It can be downloaded or reloaded from the Hugging Face Hub using the `load_from_hf_hub` method:
```python
>>> from pythae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="clementchadebec/reproduced_wae")
```
## Reproducibility
This trained model reproduces the results of Table 1 in [1].

| Model | Dataset | Metric | Obtained value | Reference value |
|:---:|:---:|:---:|:---:|:---:|
| WAE | CELEBA 64 | FID | 56.5 | 55 |

[1] I. Tolstikhin, O. Bousquet, S. Gelly, and B. Schölkopf. Wasserstein auto-encoders. In 6th International Conference on Learning Representations (ICLR 2018), 2018.