# T5-VAE-Python (flax) (WIP)

A Transformer-VAE made using flax.

It has been trained to interpolate on lines of Python code from the [python-lines dataset](https://huggingface.co/datasets/Fraser/python-lines).
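
As a rough illustration of what interpolating means here, the sketch below loads the dataset with the `datasets` library and linearly interpolates between two latent codes. The `encode`/`decode` helpers are hypothetical stand-ins for the model's actual API, which is still a work in progress.

```python
# Illustrative only: `encode`/`decode` are hypothetical stand-ins for the
# model's real API; the dataset call uses the Hugging Face `datasets` library.
import jax.numpy as jnp
from datasets import load_dataset

dataset = load_dataset("Fraser/python-lines")  # single lines of Python code


def interpolate(z_a, z_b, steps=5):
    """Linearly interpolate between two latent codes."""
    return [(1.0 - a) * z_a + a * z_b for a in jnp.linspace(0.0, 1.0, steps)]


# z_a = encode("x = 1")                # hypothetical: code line -> latent
# z_b = encode("for i in range(10):")  # hypothetical
# for z in interpolate(z_a, z_b):
#     print(decode(z))                 # hypothetical: latent -> code line
```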

Done as part of Huggingface community training ([see forum post](https://discuss.huggingface.co/t/train-a-vae-to-interpolate-on-english-sentences/7548)).

Builds on T5, adding an autoencoder bottleneck that turns it into an MMD-VAE.
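
The MMD-VAE objective swaps the usual KL-divergence term for a Maximum Mean Discrepancy between the encoder's latent codes and samples from the prior. A minimal sketch of that regulariser is below, assuming an RBF kernel and a standard-normal prior; the function names and kernel choice are illustrative rather than taken from this repo.

```python
# A minimal sketch of the MMD regulariser (not this repo's exact code):
# it penalises the distance between encoder latents and prior samples.
import jax
import jax.numpy as jnp


def rbf_kernel(x, y, sigma=1.0):
    """RBF kernel matrix between two batches of latent vectors."""
    sq_dists = jnp.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return jnp.exp(-sq_dists / (2.0 * sigma**2))


def mmd(latents, rng, sigma=1.0):
    """Biased MMD estimate between `latents` and a standard-normal prior."""
    prior = jax.random.normal(rng, latents.shape)
    return (
        rbf_kernel(latents, latents, sigma).mean()
        + rbf_kernel(prior, prior, sigma).mean()
        - 2.0 * rbf_kernel(latents, prior, sigma).mean()
    )
```

The full training loss would then be the usual T5 cross-entropy reconstruction loss plus a weighted MMD term.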

## Setup

Install the dependencies by following all the steps in the [Transformers JAX projects TPU VM guide](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md#tpu-vm).
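
Once the install is done, a quick (unofficial) way to check that JAX can see the TPU:

```python
# Quick sanity check that JAX picked up the TPU after the setup steps above.
import jax

print(jax.default_backend())  # expected: "tpu" on a TPU VM
print(jax.devices())          # should list the TPU devices
```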

- [ ] Find dataset storage site.
- [ ] Ask JAX team for dataset storage.