---
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: ner_model
  results: []
datasets:
- pythainlp/thainer-corpus-v2
language:
- th
---

# ner_model

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the [pythainlp/thainer-corpus-v2](https://huggingface.co/datasets/pythainlp/thainer-corpus-v2) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1247
- Precision: 0.8073
- Recall: 0.8695
- F1: 0.8372
- Accuracy: 0.9655

## Model description

This is [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) fine-tuned as a token-classification (named-entity recognition) model for Thai text.

## Intended uses & limitations

The model is intended for named-entity recognition on Thai text; a hedged usage sketch is provided at the end of this card. Its limitations have not been documented.

## Training and evaluation data

The model was trained and evaluated on the [pythainlp/thainer-corpus-v2](https://huggingface.co/datasets/pythainlp/thainer-corpus-v2) Thai NER corpus listed in the metadata above. The exact train/validation split used is not documented.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 0.4   | 100  | 0.5360          | 0.4604    | 0.4644 | 0.4624 | 0.8846   |
| No log        | 0.81  | 200  | 0.2882          | 0.6137    | 0.6619 | 0.6369 | 0.9307   |
| No log        | 1.21  | 300  | 0.2128          | 0.7236    | 0.7649 | 0.7437 | 0.9442   |
| No log        | 1.62  | 400  | 0.1811          | 0.7146    | 0.7925 | 0.7515 | 0.9494   |
| 0.4608        | 2.02  | 500  | 0.1594          | 0.7369    | 0.8021 | 0.7681 | 0.9542   |
| 0.4608        | 2.43  | 600  | 0.1532          | 0.7494    | 0.8331 | 0.7890 | 0.9572   |
| 0.4608        | 2.83  | 700  | 0.1403          | 0.7660    | 0.8417 | 0.8021 | 0.9594   |
| 0.4608        | 3.24  | 800  | 0.1342          | 0.7909    | 0.8428 | 0.8160 | 0.9625   |
| 0.4608        | 3.64  | 900  | 0.1325          | 0.7867    | 0.8572 | 0.8204 | 0.9626   |
| 0.1256        | 4.05  | 1000 | 0.1275          | 0.8056    | 0.8632 | 0.8334 | 0.9648   |
| 0.1256        | 4.45  | 1100 | 0.1229          | 0.8131    | 0.8643 | 0.8379 | 0.9657   |
| 0.1256        | 4.86  | 1200 | 0.1247          | 0.8073    | 0.8695 | 0.8372 | 0.9655   |

### Framework versions

- Transformers 4.35.2
- PyTorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
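
## How to use

A minimal inference sketch with the `transformers` pipeline. The model id `your-username/ner_model` is a placeholder for wherever this checkpoint is hosted, and the example sentence is illustrative; the entity labels come from the fine-tuned config.

```python
# Sketch only: replace the placeholder model id with the actual checkpoint path or hub id.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="your-username/ner_model",   # placeholder model id
    aggregation_strategy="simple",     # merge sub-word pieces into entity spans
)

text = "นายสมชายเดินทางไปจังหวัดเชียงใหม่เมื่อวันจันทร์"  # "Mr. Somchai traveled to Chiang Mai province on Monday."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```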
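
The hyperparameters above correspond to a standard `Trainer` run. The following is a hedged sketch of such a setup, not the exact training script: it assumes the corpus exposes `tokens` and `ner_tags` columns with a `ClassLabel` tag feature and a `validation` split, which may need adjusting.

```python
# Sketch, assuming "tokens"/"ner_tags" columns and a "validation" split in the corpus.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForTokenClassification,
                          DataCollatorForTokenClassification, TrainingArguments, Trainer)

raw = load_dataset("pythainlp/thainer-corpus-v2")
label_list = raw["train"].features["ner_tags"].feature.names  # assumes ClassLabel tags

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")

def tokenize_and_align(batch):
    tokenized = tokenizer(batch["tokens"], truncation=True, is_split_into_words=True)
    labels = []
    for i, tags in enumerate(batch["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        prev, ids = None, []
        for wid in word_ids:
            if wid is None:
                ids.append(-100)       # ignore special tokens in the loss
            elif wid != prev:
                ids.append(tags[wid])  # label only the first sub-token of each word
            else:
                ids.append(-100)
            prev = wid
        labels.append(ids)
    tokenized["labels"] = labels
    return tokenized

encoded = raw.map(tokenize_and_align, batched=True)

model = AutoModelForTokenClassification.from_pretrained(
    "xlm-roberta-base", num_labels=len(label_list)
)

args = TrainingArguments(
    output_dir="ner_model",
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="steps",
    eval_steps=100,  # matches the 100-step evaluation cadence in the results table
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],  # assumption: split name may differ
    tokenizer=tokenizer,
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```

The Adam settings listed above (betas=(0.9,0.999), epsilon=1e-08) are the `Trainer` defaults, so they are not set explicitly in this sketch.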