|
--- |
|
language: eng |
|
license: mit |
|
library_name: transformers |
|
pipeline_tag: translation |
|
widget: |
|
- text: Hi my name is Sarah |
|
- text: Putin is the President of Russia |
|
- text: I will send you a message on Facebook |
|
--- |
|
|
|
# Model Card for DarijaTranslation-V1 |
|
|
|
This model translates text from English to Darija (Moroccan Arabic). |
|
|
|
|
|
### Model Description |
|
|
|
This model translates text from English to Darija (Moroccan Arabic). It is geared toward general, everyday language, such as greetings ("hi, how are you?"), and accurately translates common phrases and sentences typical of informal communication.
|
|
|
- **Developed by:** BAKKALI AYOUB |
|
- **Model type:** Translation |
|
- **Language(s) (NLP):** English to Darija (Moroccan Arabic) |
|
- **License:** MIT
|
- **Finetuned from model:** marefa-nlp/marefa-mt-en-ar
|
|
|
## How to Get Started with the Model |
|
|
|
Use the code below to get started with the model: |
|
|
|
```python |
|
from transformers import pipeline |
|
|
|
# Initialize the translation pipeline |
|
pipe = pipeline("translation", model="BAKKALIAYOUB/DarijaTranslation-V1") |
|
|
|
# Translate text |
|
translated_text = pipe("putin is the president of russia") |
|
print(translated_text) |
|
``` |
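The pipeline also accepts a list of sentences. For larger inputs it can help to translate in batches; the helper below is a minimal sketch (the `translate_batch` and `chunked` names and the default batch size are illustrative, not part of the model's API):

```python
from typing import Callable, Iterable, List


def chunked(items: List[str], size: int) -> Iterable[List[str]]:
    """Yield successive batches of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]


def translate_batch(
    sentences: List[str],
    translate: Callable[[List[str]], List[dict]],
    batch_size: int = 16,
) -> List[str]:
    """Translate sentences in batches.

    `translate` is any callable that maps a list of strings to a list of
    dicts with a "translation_text" key -- e.g. the pipeline above.
    """
    results: List[str] = []
    for batch in chunked(sentences, batch_size):
        for output in translate(batch):
            results.append(output["translation_text"])
    return results
```

With the pipeline from the snippet above, `translate_batch(texts, pipe)` returns one Darija string per input sentence.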
|
|
|
# Training Details |
|
## Training Data |
|
The training data come from the [atlasia/darija-translation](https://huggingface.co/datasets/atlasia/darija-translation) dataset.
|
|
|
#### Training Hyperparameters |
|
|
|
- **Training regime:** fp16 mixed precision

- **Epochs:** 4

- **Learning rate:** 2e-5

- **Batch size:** 16
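As a rough sketch, these hyperparameters could map onto `Seq2SeqTrainingArguments` from `transformers` as shown below; the output directory and evaluation settings are illustrative assumptions, not taken from the original training run:

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative mapping of the hyperparameters above; output_dir and the
# evaluation strategy are assumptions, not from the original run.
training_args = Seq2SeqTrainingArguments(
    output_dir="darija-translation",  # hypothetical path
    num_train_epochs=4,
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    fp16=True,  # fp16 mixed precision
    evaluation_strategy="epoch",
)
```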
|
#### Speeds, Sizes, Times |
|
|
|
- **Hardware**: NVIDIA P100 GPU with 16 GB of memory
|
- **Training**: |
|
| Epoch | Training Loss | Validation Loss | |
|
|---------|---------------|-----------------| |
|
| 1 | 0.349600 | 0.311435 | |
|
| 2 | 0.305100 | 0.280260 | |
|
| 3 | 0.277700 | 0.268511 | |
|
| 4 | 0.270000 | 0.264618 | |
|
|
|
|