stefan-it committed
Commit 7f8255f
1 Parent(s): b495a24

readme: add initial version

Files changed (1)
  1. README.md +43 -0
README.md ADDED
---
license: mit
language:
- en
- de
- fr
- fi
- sv
- nl
---

# hmByT5 - Preliminary Language Models

Preliminary Historic Multilingual and Monolingual ByT5 Models. The following languages are currently covered:

* English (British Library Corpus - Books)
* German (Europeana Newspaper)
* French (Europeana Newspaper)
* Finnish (Europeana Newspaper)
* Swedish (Europeana Newspaper)
* Dutch (Delpher Corpus)

More details can be found in [our GitHub repository](https://github.com/stefan-it/hmByT5).

In this experiment we sample 4B bytes (~4GB of text) from each corpus (and upsample Swedish and Finnish).

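The exact corpus construction is documented in the GitHub repository. As a purely illustrative sketch, a fixed byte budget per corpus could be enforced as follows; the file names and the helper function are hypothetical and not part of the actual pipeline:

```python
# Hypothetical sketch: keep roughly 4B bytes (~4 GB) of UTF-8 text per corpus.
BYTE_BUDGET = 4_000_000_000  # 4B bytes ≈ 4 GB

def sample_corpus(path: str, byte_budget: int = BYTE_BUDGET) -> int:
    """Copy lines from `path` to `path + '.sampled'` until the byte budget is reached."""
    written = 0
    with open(path, encoding="utf-8") as src, \
         open(path + ".sampled", "w", encoding="utf-8") as dst:
        for line in src:
            size = len(line.encode("utf-8"))
            if written + size > byte_budget:
                break
            dst.write(line)
            written += size
    return written

# Example call (hypothetical file name): sample_corpus("delpher_dutch.txt")
```
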
# Pretraining

We use the official JAX/FLAX example in Hugging Face Transformers to pretrain a ByT5 model on a single v3-8 TPU.
Details about the training can be found [here](https://github.com/stefan-it/hmByT5/tree/main/hmbyt5-flax).

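The resulting checkpoints can be loaded with the regular Transformers API. The snippet below is a minimal sketch using the multilingual checkpoint listed in the evaluation table below; note that a span-corruption-pretrained ByT5 model still needs task-specific fine-tuning before it is useful for downstream tasks.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "hmbyt5-preliminary/byt5-small-multilingual-4g"

# ByT5 works directly on UTF-8 bytes, so the tokenizer needs no vocabulary file.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

text = "Im Jahre 1899 erschien die erste Ausgabe der Zeitung."
inputs = tokenizer(text, return_tensors="pt")
print(inputs["input_ids"].shape)  # one id per UTF-8 byte, plus the EOS token
```
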
# Evaluation on Downstream Tasks (NER)

We evaluated the hmByT5 model on the following NER downstream tasks:

| Model | English AjMC | German AjMC | French AjMC | Finnish NewsEye | Swedish NewsEye | Dutch ICDAR | French ICDAR | Avg. |
|-------|--------------|-------------|-------------|-----------------|-----------------|-------------|--------------|------|
| [`hmbyt5-preliminary/byt5-small-multilingual-4g`](https://huggingface.co/hmbyt5-preliminary/byt5-small-multilingual-4g) | 83.49 ± 0.96 | 87.65 ± 0.63 | 84.16 ± 0.90 | | | | | |

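For token-level tasks such as NER, one option is to use only the encoder of the seq2seq model. The sketch below shows how per-byte encoder representations can be extracted with `T5EncoderModel`; it is an illustration, not the exact fine-tuning setup behind the scores above (see the GitHub repository for that).

```python
import torch
from transformers import AutoTokenizer, T5EncoderModel

model_id = "hmbyt5-preliminary/byt5-small-multilingual-4g"

tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = T5EncoderModel.from_pretrained(model_id)

sentence = "Theodor Fontane wurde in Neuruppin geboren."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = encoder(**inputs)

# One hidden vector per input byte (plus EOS); a token-classification head
# on top of these representations can be fine-tuned for NER.
print(outputs.last_hidden_state.shape)  # (1, num_bytes + 1, hidden_size)
```
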
# Acknowledgements

Research supported with Cloud TPUs from Google's [TPU Research Cloud](https://sites.research.google/trc/about/) (TRC).
Many thanks for providing access to the TPUs ❤️