# T5-VAE-Python (flax) (WIP)

A Transformer-VAE made using flax.

It has been trained to interpolate between lines of Python code from the python-lines dataset.

Done as part of the Hugging Face community training (see the forum post).

It builds on T5, adding an autoencoder bottleneck to convert the model into an MMD-VAE.
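
Roughly, an MMD-VAE replaces the KL term of a standard VAE with a maximum mean discrepancy (MMD) penalty that pulls a batch of latent codes towards the prior. The sketch below shows one way to compute such a penalty in JAX; the Gaussian kernel, names, and shapes are illustrative assumptions, not code from this repository.

```python
# Minimal MMD penalty sketch, assuming 2-D latent batches of shape
# (batch, latent_dim) and a standard-normal prior. Illustrative only.
import jax
import jax.numpy as jnp


def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel between two batches of latent vectors.
    sq_dists = jnp.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return jnp.exp(-sq_dists / (2.0 * sigma ** 2))


def mmd_loss(latents, rng, sigma=1.0):
    # Compare the encoder's latent codes to samples from the N(0, I) prior.
    prior_samples = jax.random.normal(rng, latents.shape)
    k_zz = gaussian_kernel(latents, latents, sigma)
    k_pp = gaussian_kernel(prior_samples, prior_samples, sigma)
    k_zp = gaussian_kernel(latents, prior_samples, sigma)
    # Biased MMD^2 estimate: E[k(z,z')] + E[k(p,p')] - 2 E[k(z,p)].
    return jnp.mean(k_zz) + jnp.mean(k_pp) - 2.0 * jnp.mean(k_zp)
```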

## Setup

Follow all steps to install dependencies from https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md#tpu-vm

- Find a dataset storage site.
- Ask the JAX team for dataset storage (a loading sketch follows below).
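
Once storage is arranged, the training data can be pulled with the `datasets` library. The Hub path below is an assumption based on the dataset name mentioned above; substitute the actual storage location you set up.

```python
# Hedged sketch of loading the python-lines dataset with the datasets library.
# "Fraser/python-lines" is an assumed Hub path, not confirmed by this README.
from datasets import load_dataset

dataset = load_dataset("Fraser/python-lines")
print(dataset)                 # show the available splits
print(dataset["train"][0])     # inspect a single line of Python code
```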