---
license: gpl-2.0
language: ar
---

A model jointly trained and fine-tuned on the Quran, Saheefa, and Nahj al-Balagha. All datasets are available here. Code will be available soon.

Some examples for filling the mask (a usage sketch follows the list):

- `ذَلِكَ [MASK] لَا رَيْبَ فِيهِ هُدًى لِلْمُتَّقِينَ`
- `يَا أَيُّهَا النَّاسُ اعْبُدُوا رَبَّكُمُ الَّذِي خَلَقَكُمْ وَالَّذِينَ مِنْ قَبْلِكُمْ لَعَلَّكُمْ [MASK]`

This model was fine-tuned from [Bert Base Arabic](https://huggingface.co/asafaya/bert-base-arabic) for 30 epochs using masked language modeling (MLM). In addition, every 5 epochs we re-masked the corpus from scratch, so the model sees fresh mask positions and learns the embeddings well instead of overfitting to a single fixed masking pattern.
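Since the training code is not yet released, the following is only a minimal sketch of that re-masking schedule, assuming a plain PyTorch loop with `DataCollatorForLanguageModeling`; the corpus contents, batch size, and learning rate are illustrative assumptions, not the actual configuration:

```python
import torch
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling)

tokenizer = AutoTokenizer.from_pretrained("asafaya/bert-base-arabic")
model = AutoModelForMaskedLM.from_pretrained("asafaya/bert-base-arabic")
collator = DataCollatorForLanguageModeling(tokenizer, mlm=True, mlm_probability=0.15)

# Placeholder corpus: in the real run this would hold the Quran, Saheefa,
# and Nahj al-Balagha sentences (an assumption; the code is unreleased).
sentences = ["..."]
encodings = [tokenizer(s, truncation=True, max_length=128) for s in sentences]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()

for epoch in range(30):
    if epoch % 5 == 0:
        # Re-draw the [MASK] positions so each 5-epoch block trains on a
        # fresh masking of the corpus rather than one memorized pattern.
        batches = [collator(encodings[i:i + 8]) for i in range(0, len(encodings), 8)]
    for batch in batches:
        loss = model(**batch).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```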