---
language: ti
widget:
- text: "ዓቕሚ መንእሰይ ኤርትራ [MASK] ተራእዩ"
---
# Pre-trained ELECTRA Small for the Tigrinya Language
We pre-trained ELECTRA small on the [TLMD](https://zenodo.org/record/5139094) dataset, which contains over 40 million tokens.
Both Flax and PyTorch versions of the trained model are included.
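
A minimal usage sketch with the 🤗 Transformers fill-mask pipeline, assuming the model is published under the `fgaim/tielectra-small` identifier (inferred from this repository's name):

```python
from transformers import pipeline

# Load the pre-trained TiELECTRA small model for masked-token prediction.
# The "fgaim/tielectra-small" identifier is an assumption based on this repo.
fill_mask = pipeline("fill-mask", model="fgaim/tielectra-small")

# Predict the masked token in a Tigrinya sentence (the widget example above).
predictions = fill_mask("ዓቕሚ መንእሰይ ኤርትራ [MASK] ተራእዩ")
for pred in predictions:
    print(pred["token_str"], pred["score"])
```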
## Hyperparameters
The hyperparameters of the SMALL model are as follows:
| Model Size | L | AH | HS | FFN | P | Seq |
|------------|----|----|-----|------|------|------|
| SMALL | 12 | 4 | 256 | 1024 | 14M | 512 |
(L = number of layers; AH = number of attention heads; HS = hidden size; FFN = feedforward network dimension; P = number of parameters; Seq = maximum sequence length.)
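
These values should be reflected in the published model configuration; a quick way to verify them is a sketch like the following, again assuming the `fgaim/tielectra-small` identifier:

```python
from transformers import AutoConfig

# Fetch the model configuration from the Hub.
config = AutoConfig.from_pretrained("fgaim/tielectra-small")

# Map the table columns to the corresponding config attributes.
print(config.num_hidden_layers)        # L   = 12
print(config.num_attention_heads)      # AH  = 4
print(config.hidden_size)              # HS  = 256
print(config.intermediate_size)        # FFN = 1024
print(config.max_position_embeddings)  # Seq = 512
```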
### Framework versions
- Transformers 4.12.0.dev0
- PyTorch 1.9.0+cu111
- Datasets 1.13.3
- Tokenizers 0.10.3
## Citation
If you use this model in your product or research, please cite as follows:
```bibtex
@inproceedings{Fitsum2021TiPLMs,
  author    = {Fitsum Gaim and Wonsuk Yang and Jong C. Park},
  title     = {Monolingual Pre-trained Language Models for Tigrinya},
  year      = {2021},
  booktitle = {WiNLP 2021 at EMNLP 2021}
}
```