---
tags:
- generated_from_trainer
model-index:
- name: arzwiki_mlm
  results: []
metrics:
- perplexity
license: mit
datasets:
- SaiedAlshahrani/Egyptian_Arabic_Wikipedia_20230101
language:
- ar
library_name: transformers
pipeline_tag: fill-mask
widget:
- text: الهدف من الحياة هو
---

# arzwiki_mlm (arzRoBERTa)

This model (arzRoBERTa) is a RoBERTa-style masked language model trained from scratch on the [SaiedAlshahrani/Egyptian_Arabic_Wikipedia_20230101](https://huggingface.co/datasets/SaiedAlshahrani/Egyptian_Arabic_Wikipedia_20230101) dataset.
It achieves the following results on the evaluation set (a sketch of the pseudo-perplexity computation appears at the end of this card):
- Pseudo-Perplexity:

## Model description

arzRoBERTa is a masked language model for Egyptian Arabic, pretrained on the articles of the Egyptian Arabic Wikipedia (dump of 2023-01-01).

## Intended uses & limitations

The model is intended for fill-mask prediction (see the usage sketch at the end of this card) and as a base model for fine-tuning on Egyptian Arabic downstream tasks. Because it is trained only on Wikipedia text, it inherits the coverage and biases of that corpus.

## Training and evaluation data

The model was trained on the [SaiedAlshahrani/Egyptian_Arabic_Wikipedia_20230101](https://huggingface.co/datasets/SaiedAlshahrani/Egyptian_Arabic_Wikipedia_20230101) dataset listed in the metadata above.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a mapping onto `TrainingArguments` is sketched at the end of this card):
- learning_rate: 0.0001
- train_batch_size: 256
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

| Epoch | Step  | Training Loss |
|:-----:|:-----:|:-------------:|
| 1     | 2500  | 2.0383        |
| 2     | 5000  | 0.8788        |
| 3     | 7500  | 0.6828        |
| 4     | 10000 | 0.6131        |
| 5     | 12500 | 0.5745        |

| Train Runtime (s) | Train Samples/s | Train Steps/s | Total FLOPs | Train Loss | Epoch |
|:-----------------:|:---------------:|:-------------:|:-----------:|:----------:|:-----:|
| 14677.1174        | 248.119         | 0.97          | 1.2075e+17  | 0.908513   | 5     |

### Framework versions

- Datasets 2.9.0
- Tokenizers 0.12.1
- Transformers 4.24.0
- Pytorch 1.12.1+cu116
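
## How to use

The card's metadata tags this model for the `fill-mask` task, so it can be queried through the `transformers` pipeline. A minimal sketch, assuming the checkpoint is published under the hypothetical repo id `SaiedAlshahrani/arzwiki_mlm` (inferred from the model name and dataset namespace above; adjust to the actual repo id):

```python
from transformers import pipeline

# Hypothetical repo id, inferred from the model name above.
fill_mask = pipeline("fill-mask", model="SaiedAlshahrani/arzwiki_mlm")

# The widget prompt from the metadata ("the goal of life is ..."),
# completed with RoBERTa's <mask> token.
for prediction in fill_mask("الهدف من الحياة هو <mask>."):
    print(prediction["token_str"], round(prediction["score"], 4))
```

The `<mask>` token is the RoBERTa convention; if the tokenizer defines a different mask token, `fill_mask.tokenizer.mask_token` gives the right one.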
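
## Computing pseudo-perplexity

Masked language models have no left-to-right likelihood, so this card reports pseudo-perplexity: each token is masked in turn, the model scores the original token at that position, and the exponentiated mean negative log-likelihood over all positions is taken (Salazar et al., 2020). A minimal sketch of that computation, reusing the hypothetical repo id from above; this illustrates the metric, not the exact evaluation script behind the reported number:

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "SaiedAlshahrani/arzwiki_mlm"  # hypothetical repo id, as above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id).eval()

def pseudo_perplexity(text: str) -> float:
    ids = tokenizer(text, return_tensors="pt").input_ids
    nlls = []
    # Mask each non-special token in turn (positions 0 and -1 are <s> and </s>).
    for i in range(1, ids.size(1) - 1):
        masked = ids.clone()
        masked[0, i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(input_ids=masked).logits
        log_probs = logits[0, i].log_softmax(dim=-1)
        nlls.append(-log_probs[ids[0, i]].item())
    return float(torch.tensor(nlls).mean().exp())

print(pseudo_perplexity("الهدف من الحياة هو السعادة."))  # "the goal of life is happiness."
```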
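
## Training configuration sketch

The hyperparameters listed under "Training procedure" correspond to the standard `Trainer` setup in Transformers 4.24. A sketch of the equivalent `TrainingArguments`, assuming a single-device run (the card's `train_batch_size: 256` is mapped to `per_device_train_batch_size`, which misstates the per-device size if multiple GPUs or gradient accumulation were used); this is a reconstruction, not the author's actual script:

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed on this card; the actual
# training script is not part of the card.
training_args = TrainingArguments(
    output_dir="arzwiki_mlm",
    learning_rate=1e-4,
    per_device_train_batch_size=256,  # card lists train_batch_size: 256
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-6,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```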