---
license: mit
language:
  - en
---

# hmByT5 - Preliminary Language Models

Preliminary Historic Multilingual and Monolingual ByT5 models. The following languages are currently covered:

* English (British Library Corpus - Books)

More details can be found in our GitHub repository.

## Pretraining

We use the official JAX/FLAX example in Hugging Face Transformers to pretrain a ByT5 model on a single v3-8 TPU. Details about the training can be found here.
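
Once pretrained, the checkpoint can be loaded like any other ByT5 model in Transformers. Below is a minimal sketch, assuming the checkpoint is published on the Hugging Face Hub under the `hmbyt5/byt5-small-english` ID used in the evaluation table below:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "hmbyt5/byt5-small-english"  # ID taken from the evaluation table below

# ByT5 operates directly on UTF-8 bytes, so the tokenizer needs no vocabulary file.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Span-corruption-style sanity check: ask the model to fill in a masked span.
text = "The life of the <extra_id_0> is a history of the world."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```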

## Evaluation on Downstream Tasks (NER)

We evaluated the hmByT5 model on downstream tasks:

| Model                       | English AjMC | German AjMC  | French AjMC  | Finnish NewsEye | Swedish NewsEye | Dutch ICDAR | French ICDAR | Avg. |
|-----------------------------|--------------|--------------|--------------|-----------------|-----------------|-------------|--------------|------|
| `hmbyt5/byt5-small-english` | 85.65 ± 1.21 | 87.27 ± 0.50 | 84.44 ± 0.79 |                 |                 |             |              |      |
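
For readers who want to build a comparable NER setup, the sketch below shows one way to feed byte-level encoder representations into a token-classification head. It is an illustration only: `T5EncoderModel` is standard Transformers API, but the linear head and the label count are hypothetical and not the exact evaluation pipeline used here.

```python
import torch
from transformers import AutoTokenizer, T5EncoderModel

model_id = "hmbyt5/byt5-small-english"
num_labels = 9  # hypothetical NER tag set size (e.g. BIO tags)

tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = T5EncoderModel.from_pretrained(model_id)

# Hypothetical linear head on top of the byte-level hidden states.
head = torch.nn.Linear(encoder.config.d_model, num_labels)

inputs = tokenizer("Mr. Darcy walked to Pemberley.", return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**inputs).last_hidden_state  # (batch, bytes, d_model)
logits = head(hidden)  # per-byte label scores, to be aligned back to words
print(logits.shape)
```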

## Acknowledgements

Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC). Many thanks for providing access to the TPUs ❤️