Gorka Urbizu Garmendia committed
Commit e2f7ea6
1 Parent(s): 5b3edfc

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -21,7 +21,7 @@ To train ElhBERTeu, we collected different corpora sources from several domains:
 |Others | 7M |
 |Total | 575M |
 
-ElhBERTeu is a base, uncased monolingual BERT model for Basque, with a vocab size of 50K. which sums up to 124M parameters in total.
+ElhBERTeu is a base, uncased monolingual BERT model for Basque, with a vocab size of 50K, which has 124M parameters in total.
 
 ElhBERTeu was trained following the design decisions for [BERTeus](https://huggingface.co/ixa-ehu/berteus-base-cased). The tokenizer and the hyper-parameter settings remained the same, with the only difference being that the full pre-training of the model (1M steps) was performed with a sequence length of 512 on a v3-8 TPU.
 
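As a quick sanity check of the figures in the corrected line, here is a minimal sketch that loads the model with the Hugging Face `transformers` library and prints the vocab and parameter counts. The Hub repository ID `orai-nlp/ElhBERTeu` is an assumption, not stated in this commit.

```python
# Minimal sketch: verify the ~50K vocab and ~124M parameter figures quoted above.
# NOTE: the Hub ID "orai-nlp/ElhBERTeu" is an assumption, not stated in this commit.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("orai-nlp/ElhBERTeu")
model = AutoModel.from_pretrained("orai-nlp/ElhBERTeu")

# Tokenizer vocabulary size (expected: roughly 50K entries).
print("vocab size:", tokenizer.vocab_size)

# Total trainable parameters (expected: roughly 124M for a BERT-base
# architecture with a 50K-entry embedding table).
print("parameters:", sum(p.numel() for p in model.parameters()))
```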