---
base_model: FacebookAI/xlm-roberta-base
library_name: transformers
license: mit
metrics:
- precision
- recall
- f1
- accuracy
tags:
- generated_from_trainer
model-index:
- name: scenario-kd-pre-ner-full-xlmr_data-univner_full44
  results: []
---

# scenario-kd-pre-ner-full-xlmr_data-univner_full44

This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 48.3210
- Precision: 0.8129
- Recall: 0.8285
- F1: 0.8206
- Accuracy: 0.9813
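
If the checkpoint is on the Hub, a minimal inference sketch looks like the following. The repo id `haryoaw/scenario-kd-pre-ner-full-xlmr_data-univner_full44` is inferred from this card and is an assumption; point `model` at a local checkpoint directory otherwise.

```python
from transformers import pipeline

# Assumed repo id (inferred from the card); replace with a local path if needed.
ner = pipeline(
    "token-classification",
    model="haryoaw/scenario-kd-pre-ner-full-xlmr_data-univner_full44",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)
print(ner("Angela Merkel visited Paris in 2019."))
```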

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 32
- seed: 44
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
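
These settings map onto Hugging Face `TrainingArguments` roughly as in the sketch below. This is a reconstruction under assumptions, not the author's script: `output_dir` is a placeholder, and the knowledge-distillation loss implied by the "kd" in the model name is not shown.

```python
from transformers import TrainingArguments

# Hedged sketch: reproduces the hyperparameters listed above; dataset
# loading, model setup, and the Trainer call are omitted.
training_args = TrainingArguments(
    output_dir="scenario-kd-pre-ner-full-xlmr_data-univner_full44",  # placeholder
    learning_rate=3e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # effective train batch size: 8 * 4 = 32
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=44,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 match the library defaults,
    # so no explicit optimizer arguments are needed here.
)
```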

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 126.6202      | 0.2911 | 500   | 87.2751         | 0.7296    | 0.7159 | 0.7227 | 0.9734   |
| 79.5244       | 0.5822 | 1000  | 74.0801         | 0.7585    | 0.7846 | 0.7713 | 0.9774   |
| 71.0482       | 0.8732 | 1500  | 69.1703         | 0.7824    | 0.7813 | 0.7818 | 0.9785   |
| 66.2795       | 1.1643 | 2000  | 65.4836         | 0.8008    | 0.7984 | 0.7996 | 0.9800   |
| 62.9041       | 1.4554 | 2500  | 63.7689         | 0.8089    | 0.7806 | 0.7945 | 0.9794   |
| 60.9832       | 1.7465 | 3000  | 61.2403         | 0.7992    | 0.8150 | 0.8071 | 0.9804   |
| 59.0595       | 2.0375 | 3500  | 59.8787         | 0.7874    | 0.8251 | 0.8058 | 0.9800   |
| 56.95         | 2.3286 | 4000  | 58.4894         | 0.8164    | 0.8054 | 0.8109 | 0.9807   |
| 55.6946       | 2.6197 | 4500  | 57.2299         | 0.8001    | 0.8189 | 0.8094 | 0.9809   |
| 54.6743       | 2.9108 | 5000  | 56.1128         | 0.8083    | 0.8231 | 0.8156 | 0.9809   |
| 53.4043       | 3.2019 | 5500  | 55.4426         | 0.8135    | 0.8080 | 0.8107 | 0.9806   |
| 52.4045       | 3.4929 | 6000  | 54.4679         | 0.7964    | 0.8325 | 0.8141 | 0.9809   |
| 51.6726       | 3.7840 | 6500  | 53.7073         | 0.8099    | 0.8306 | 0.8201 | 0.9813   |
| 50.959        | 4.0751 | 7000  | 53.1030         | 0.8108    | 0.8272 | 0.8189 | 0.9812   |
| 50.2364       | 4.3662 | 7500  | 52.5818         | 0.8055    | 0.8342 | 0.8196 | 0.9815   |
| 49.7482       | 4.6573 | 8000  | 52.1174         | 0.8119    | 0.8306 | 0.8211 | 0.9816   |
| 49.2511       | 4.9483 | 8500  | 51.6036         | 0.8157    | 0.8214 | 0.8185 | 0.9812   |
| 48.5192       | 5.2394 | 9000  | 51.1405         | 0.8103    | 0.8285 | 0.8193 | 0.9810   |
| 48.1685       | 5.5305 | 9500  | 50.7789         | 0.8189    | 0.8300 | 0.8244 | 0.9818   |
| 47.9619       | 5.8216 | 10000 | 50.5119         | 0.8044    | 0.8344 | 0.8191 | 0.9813   |
| 47.4978       | 6.1126 | 10500 | 50.1726         | 0.817     | 0.8251 | 0.8210 | 0.9818   |
| 47.1816       | 6.4037 | 11000 | 49.9418         | 0.8162    | 0.8260 | 0.8211 | 0.9819   |
| 46.9279       | 6.6948 | 11500 | 49.6736         | 0.8223    | 0.8290 | 0.8256 | 0.9818   |
| 46.5873       | 6.9859 | 12000 | 49.4774         | 0.8228    | 0.8303 | 0.8266 | 0.9821   |
| 46.3493       | 7.2770 | 12500 | 49.1917         | 0.8194    | 0.8270 | 0.8232 | 0.9817   |
| 46.133        | 7.5680 | 13000 | 48.9379         | 0.8227    | 0.8370 | 0.8298 | 0.9821   |
| 46.0883       | 7.8591 | 13500 | 48.9742         | 0.8248    | 0.8254 | 0.8251 | 0.9819   |
| 45.812        | 8.1502 | 14000 | 48.6892         | 0.8200    | 0.8332 | 0.8265 | 0.9818   |
| 45.6582       | 8.4413 | 14500 | 48.5991         | 0.8153    | 0.8339 | 0.8245 | 0.9820   |
| 45.6454       | 8.7324 | 15000 | 48.5257         | 0.8204    | 0.8290 | 0.8247 | 0.9819   |
| 45.4808       | 9.0234 | 15500 | 48.4097         | 0.8107    | 0.8270 | 0.8188 | 0.9815   |
| 45.3486       | 9.3145 | 16000 | 48.4108         | 0.8191    | 0.8290 | 0.8240 | 0.9819   |
| 45.2876       | 9.6056 | 16500 | 48.3039         | 0.8148    | 0.8326 | 0.8236 | 0.9819   |
| 45.3167       | 9.8967 | 17000 | 48.3210         | 0.8129    | 0.8285 | 0.8206 | 0.9813   |
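
The precision, recall, and F1 columns are presumably entity-level scores of the kind `seqeval` produces for NER; the card does not say so explicitly. A minimal sketch of computing such scores from BIO-tagged label sequences:

```python
import evaluate

# Toy BIO-tagged sequences; in practice these are the model's predicted
# labels and the gold labels of the evaluation set.
predictions = [["B-PER", "I-PER", "O", "B-LOC"]]
references = [["B-PER", "I-PER", "O", "B-ORG"]]

seqeval = evaluate.load("seqeval")
results = seqeval.compute(predictions=predictions, references=references)
print({k: results[f"overall_{k}"] for k in ("precision", "recall", "f1", "accuracy")})
```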

### Framework versions

- Transformers 4.44.2
- PyTorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1