---
language:
- mn
base_model: bayartsogt/mongolian-roberta-base
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: roberta-base-ner-demo
  results: []
---

# roberta-base-ner-demo

This model is a fine-tuned version of [bayartsogt/mongolian-roberta-base](https://huggingface.co/bayartsogt/mongolian-roberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1307
- Precision: 0.9299
- Recall: 0.9402
- F1: 0.9350
- Accuracy: 0.9805

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch at the end of this card):
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.7194        | 0.9958  | 119  | 0.1195          | 0.7550    | 0.8328 | 0.7920 | 0.9602   |
| 0.103         | 2.0     | 239  | 0.0894          | 0.8341    | 0.8782 | 0.8556 | 0.9695   |
| 0.0517        | 2.9958  | 358  | 0.0761          | 0.9138    | 0.9321 | 0.9228 | 0.9792   |
| 0.0255        | 4.0     | 478  | 0.0921          | 0.9118    | 0.9287 | 0.9202 | 0.9778   |
| 0.016         | 4.9958  | 597  | 0.0945          | 0.9242    | 0.9343 | 0.9292 | 0.9794   |
| 0.0102        | 6.0     | 717  | 0.0978          | 0.9266    | 0.9382 | 0.9324 | 0.9801   |
| 0.0066        | 6.9958  | 836  | 0.1092          | 0.9265    | 0.9368 | 0.9316 | 0.9800   |
| 0.005         | 8.0     | 956  | 0.1150          | 0.9228    | 0.9366 | 0.9297 | 0.9796   |
| 0.0034        | 8.9958  | 1075 | 0.1189          | 0.9274    | 0.9373 | 0.9323 | 0.9800   |
| 0.003         | 10.0    | 1195 | 0.1242          | 0.9215    | 0.9360 | 0.9287 | 0.9793   |
| 0.0025        | 10.9958 | 1314 | 0.1288          | 0.9256    | 0.9375 | 0.9315 | 0.9797   |
| 0.0016        | 12.0    | 1434 | 0.1318          | 0.9273    | 0.9365 | 0.9319 | 0.9799   |
| 0.0015        | 12.9958 | 1553 | 0.1314          | 0.9286    | 0.9394 | 0.9340 | 0.9801   |
| 0.0013        | 14.0    | 1673 | 0.1308          | 0.9290    | 0.9393 | 0.9341 | 0.9803   |
| 0.0012        | 14.9372 | 1785 | 0.1307          | 0.9299    | 0.9402 | 0.9350 | 0.9805   |

### Framework versions

- Transformers 4.40.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
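
As a minimal inference sketch for a token-classification checkpoint like this one: the Hub repo id `your-username/roberta-base-ner-demo` and the example sentence below are assumptions for illustration, not values taken from this card.

```python
from transformers import pipeline

# Minimal NER inference sketch. The repo id is a placeholder; substitute the
# actual Hub path of this checkpoint.
ner = pipeline(
    "token-classification",
    model="your-username/roberta-base-ner-demo",
    aggregation_strategy="simple",  # merge sub-word pieces into word-level entities
)

# Assumed Mongolian example ("There is a statue of D. Natsagdorj in Ulaanbaatar").
print(ner("Улаанбаатар хотод Д.Нацагдоржийн хөшөө байдаг."))
```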
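
The hyperparameters listed under "Training hyperparameters" map directly onto `transformers.TrainingArguments`. The sketch below is an assumed reconstruction, not the original training script; `output_dir` is a placeholder, and the Adam betas/epsilon shown in the card are the Trainer defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# Assumed reconstruction of the hyperparameters reported above.
training_args = TrainingArguments(
    output_dir="roberta-base-ner-demo",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=2,  # effective train batch size: 32 * 2 = 64
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=15,
    fp16=True,  # "Native AMP" mixed-precision training
)
```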