---
language: nl
widget:
  - text: In het jaar 2030 zullen we
  - text: Toen ik gisteren volledig in de ban was van
  - text: >-
      Studenten en leraren van de Bogazici Universiteit in de Turkse stad
      Istanbul
  - text: In Israël was een strenge lockdown
tags:
  - gpt-neo-1.3B
  - gpt-neo
pipeline_tag: text-generation
datasets:
  - yhavinga/mc4_nl_cleaned
---

# GPT Neo 1.3B pre-trained on cleaned Dutch mC4 🇳🇱

NB: Training in progress.

Dataset:

* mC4 NL Cleaned (`yhavinga/mc4_nl_cleaned`), a cleaned Dutch subset of the multilingual C4 corpus

Tokenizer:

* Tokenizer trained on mC4 with scripts from the Hugging Face Transformers Flax examples
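The Flax example scripts train a byte-level BPE tokenizer (GPT-Neo style) from scratch on the corpus. A minimal sketch with the `tokenizers` library — the toy corpus and vocabulary size below are illustrative, not the actual training settings:

```python
from tokenizers import ByteLevelBPETokenizer

# Toy in-memory corpus standing in for the (much larger) cleaned Dutch mC4 stream.
corpus = [
    "In het jaar 2030 zullen we",
    "Toen ik gisteren volledig in de ban was van",
    "In Israël was een strenge lockdown",
]

# GPT-Neo style byte-level BPE; vocab_size here is a toy value,
# far smaller than a realistic ~50k vocabulary.
tokenizer = ByteLevelBPETokenizer()
tokenizer.train_from_iterator(
    corpus,
    vocab_size=500,
    min_frequency=1,
    special_tokens=["<|endoftext|>"],
)

# Byte-level BPE round-trips text exactly through encode/decode.
ids = tokenizer.encode("In het jaar 2030").ids
print(tokenizer.decode(ids))
```

Because the tokenizer operates on bytes, it never produces out-of-vocabulary tokens, which is why the same approach works unchanged for Dutch text with diacritics.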

Training details:

* Trained for ? steps (as of 1 Jan 2022)
* Block size: 512
* Optimizer: Adafactor
* Learning rate: 5e-5
* Batch size: 64
* Warmup steps: 5000
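The schedule implied by these settings can be sketched as a linear warmup to the peak rate over the first 5000 steps; what happens after warmup is an assumption here (constant rate), since the card only lists the peak learning rate and the warmup length:

```python
def learning_rate(step: int, peak_lr: float = 5e-5, warmup_steps: int = 5000) -> float:
    """Linear warmup to peak_lr, then constant (post-warmup behaviour is assumed)."""
    if step < warmup_steps:
        # Ramp linearly from 0 at step 0 to peak_lr at warmup_steps.
        return peak_lr * step / warmup_steps
    return peak_lr

# Rate at the start, midway through warmup, and after warmup completes.
print(learning_rate(0))
print(learning_rate(2500))
print(learning_rate(10000))
```

With batch size 64 and block size 512, each optimizer step consumes 64 × 512 = 32,768 tokens, so the 5000 warmup steps cover roughly 164 million tokens.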

Work in progress, January 2022.