---
title: README
emoji: 👍
colorFrom: pink
colorTo: indigo
sdk: static
pinned: false
---

This organization is part of the NeurIPS 2021 demonstration "Training Transformers Together".

In this demo, we train a model similar to OpenAI DALL-E: a Transformer "language model" that generates images from text descriptions. It is trained on LAION-400M, the world's largest openly available image-text-pair dataset with 400 million samples. Our model is based on the dalle-pytorch implementation by Phil Wang with a few tweaks to make it communication-efficient.
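For a sense of what such a model looks like, here is a minimal sketch using the dalle-pytorch library mentioned above. The hyperparameters are illustrative placeholders, not the configuration used in this run, and the communication-efficiency tweaks are omitted.

```python
# Minimal sketch of a DALL-E-style model with dalle-pytorch (pip install dalle-pytorch).
# Hyperparameters below are illustrative, not the settings used in this collaborative run.
import torch
from dalle_pytorch import DiscreteVAE, DALLE

# Discrete VAE that turns 256x256 images into a sequence of visual tokens
vae = DiscreteVAE(
    image_size = 256,
    num_layers = 3,
    num_tokens = 8192,
    codebook_dim = 1024,
    hidden_dim = 64,
)

# Transformer "language model" over concatenated text + image tokens
dalle = DALLE(
    dim = 1024,
    vae = vae,               # infers image sequence length and number of image tokens
    num_text_tokens = 10000, # text vocabulary size
    text_seq_len = 256,      # caption length in tokens
    depth = 12,
    heads = 16,
    dim_head = 64,
)

# One training step on a toy batch of (caption, image) pairs
text = torch.randint(0, 10000, (4, 256))  # token ids for 4 captions
images = torch.randn(4, 3, 256, 256)      # 4 RGB images
loss = dalle(text, images, return_loss = True)
loss.backward()
```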

See our website for details about how to join and how the training works.

This organization gathers people participating in the collaborative training and provides links to the necessary resources:

- 👉 Starter kits for Google Colab and Kaggle (an easy way to join the training)
- 👉 Dashboard (the current training state: loss, number of peers, etc.)
- 👉 Model (the latest checkpoint)
- 👉 Dataset

Feel free to reach us on Discord if you have any questions 🙂