---
base_model: vinai/phobert-base-v2
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: PhoBert_Lexical_Datasnet51KBoDuoiWithNewLexical
  results: []
---

# PhoBert_Lexical_Datasnet51KBoDuoiWithNewLexical

This model is a fine-tuned version of [vinai/phobert-base-v2](https://huggingface.co/vinai/phobert-base-v2) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.9059
- Accuracy: 0.848
- F1: 0.8469
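
For reference, a minimal inference sketch follows. It assumes the repository id `phunganhsang/PhoBert_Lexical_Datasnet51KBoDuoiWithNewLexical` (inferred from the card title) and that the checkpoint carries a sequence-classification head, which the accuracy/F1 metrics suggest; the label names are not documented in this card.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

# Repository id assumed from the card title; adjust if the model lives elsewhere.
model_id = "phunganhsang/PhoBert_Lexical_Datasnet51KBoDuoiWithNewLexical"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)

# PhoBERT expects word-segmented Vietnamese input (e.g. pre-segmented with
# VnCoreNLP or underthesea), so segment raw text before classifying it.
print(classifier("Tôi rất thích mô_hình này ."))
```

Unless an `id2label` mapping was saved with the checkpoint, the predictions will use the generic `LABEL_0`, `LABEL_1`, ... names.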

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
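
As referenced above, these settings map onto `transformers.TrainingArguments` roughly as sketched below. This is a reconstruction, not the author's training script: `output_dir` is illustrative, and `eval_steps=200` is inferred from the evaluation cadence visible in the results table.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="PhoBert_Lexical_Datasnet51KBoDuoiWithNewLexical",  # illustrative
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    eval_strategy="steps",
    eval_steps=200,  # inferred: the table logs validation metrics every 200 steps
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults,
    # so they need no explicit arguments here.
)
```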

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:------:|
| No log        | 0.2506  | 200   | 0.6421          | 0.727    | 0.7274 |
| No log        | 0.5013  | 400   | 0.5851          | 0.745    | 0.7340 |
| No log        | 0.7519  | 600   | 0.5522          | 0.758    | 0.7561 |
| 0.3486        | 1.0025  | 800   | 0.5461          | 0.779    | 0.7693 |
| 0.3486        | 1.2531  | 1000  | 0.5009          | 0.786    | 0.7796 |
| 0.3486        | 1.5038  | 1200  | 0.5538          | 0.77     | 0.7681 |
| 0.3486        | 1.7544  | 1400  | 0.5067          | 0.777    | 0.7676 |
| 0.2582        | 2.0050  | 1600  | 0.5031          | 0.792    | 0.7842 |
| 0.2582        | 2.2556  | 1800  | 0.5232          | 0.786    | 0.7811 |
| 0.2582        | 2.5063  | 2000  | 0.5484          | 0.785    | 0.7827 |
| 0.2582        | 2.7569  | 2200  | 0.5155          | 0.798    | 0.7950 |
| 0.2134        | 3.0075  | 2400  | 0.5280          | 0.787    | 0.7847 |
| 0.2134        | 3.2581  | 2600  | 0.5263          | 0.793    | 0.7860 |
| 0.2134        | 3.5088  | 2800  | 0.5273          | 0.794    | 0.7930 |
| 0.2134        | 3.7594  | 3000  | 0.5354          | 0.792    | 0.7871 |
| 0.1799        | 4.0100  | 3200  | 0.5071          | 0.804    | 0.7982 |
| 0.1799        | 4.2607  | 3400  | 0.5655          | 0.802    | 0.7975 |
| 0.1799        | 4.5113  | 3600  | 0.5497          | 0.809    | 0.8034 |
| 0.1799        | 4.7619  | 3800  | 0.5645          | 0.808    | 0.8020 |
| 0.1493        | 5.0125  | 4000  | 0.5565          | 0.807    | 0.8003 |
| 0.1493        | 5.2632  | 4200  | 0.5861          | 0.813    | 0.8091 |
| 0.1493        | 5.5138  | 4400  | 0.6189          | 0.807    | 0.8055 |
| 0.1493        | 5.7644  | 4600  | 0.5003          | 0.819    | 0.8151 |
| 0.1278        | 6.0150  | 4800  | 0.5903          | 0.81     | 0.8080 |
| 0.1278        | 6.2657  | 5000  | 0.6112          | 0.81     | 0.8090 |
| 0.1278        | 6.5163  | 5200  | 0.5987          | 0.818    | 0.8150 |
| 0.1278        | 6.7669  | 5400  | 0.6466          | 0.802    | 0.8006 |
| 0.1091        | 7.0175  | 5600  | 0.6515          | 0.818    | 0.8172 |
| 0.1091        | 7.2682  | 5800  | 0.6541          | 0.819    | 0.8174 |
| 0.1091        | 7.5188  | 6000  | 0.6148          | 0.828    | 0.8246 |
| 0.1091        | 7.7694  | 6200  | 0.6246          | 0.827    | 0.8262 |
| 0.0917        | 8.0201  | 6400  | 0.6751          | 0.812    | 0.8120 |
| 0.0917        | 8.2707  | 6600  | 0.6371          | 0.829    | 0.8281 |
| 0.0917        | 8.5213  | 6800  | 0.6979          | 0.823    | 0.8212 |
| 0.0917        | 8.7719  | 7000  | 0.6792          | 0.827    | 0.8241 |
| 0.0764        | 9.0226  | 7200  | 0.6833          | 0.835    | 0.8324 |
| 0.0764        | 9.2732  | 7400  | 0.6810          | 0.84     | 0.8383 |
| 0.0764        | 9.5238  | 7600  | 0.6799          | 0.826    | 0.8233 |
| 0.0764        | 9.7744  | 7800  | 0.6988          | 0.823    | 0.8211 |
| 0.0676        | 10.0251 | 8000  | 0.7213          | 0.832    | 0.8299 |
| 0.0676        | 10.2757 | 8200  | 0.7325          | 0.83     | 0.8288 |
| 0.0676        | 10.5263 | 8400  | 0.7338          | 0.831    | 0.8293 |
| 0.0676        | 10.7769 | 8600  | 0.7210          | 0.827    | 0.8262 |
| 0.0585        | 11.0276 | 8800  | 0.7874          | 0.829    | 0.8282 |
| 0.0585        | 11.2782 | 9000  | 0.7764          | 0.836    | 0.8347 |
| 0.0585        | 11.5288 | 9200  | 0.8112          | 0.835    | 0.8342 |
| 0.0585        | 11.7794 | 9400  | 0.7485          | 0.832    | 0.8307 |
| 0.0536        | 12.0301 | 9600  | 0.7640          | 0.835    | 0.8339 |
| 0.0536        | 12.2807 | 9800  | 0.7865          | 0.837    | 0.8344 |
| 0.0536        | 12.5313 | 10000 | 0.7579          | 0.836    | 0.8350 |
| 0.0536        | 12.7820 | 10200 | 0.7846          | 0.837    | 0.8357 |
| 0.0449        | 13.0326 | 10400 | 0.7385          | 0.84     | 0.8384 |
| 0.0449        | 13.2832 | 10600 | 0.8481          | 0.838    | 0.8372 |
| 0.0449        | 13.5338 | 10800 | 0.8422          | 0.838    | 0.8368 |
| 0.0449        | 13.7845 | 11000 | 0.7742          | 0.846    | 0.8445 |
| 0.0387        | 14.0351 | 11200 | 0.8304          | 0.842    | 0.8406 |
| 0.0387        | 14.2857 | 11400 | 0.7922          | 0.84     | 0.8386 |
| 0.0387        | 14.5363 | 11600 | 0.8284          | 0.843    | 0.8415 |
| 0.0387        | 14.7870 | 11800 | 0.8289          | 0.846    | 0.8449 |
| 0.0345        | 15.0376 | 12000 | 0.8012          | 0.849    | 0.8475 |
| 0.0345        | 15.2882 | 12200 | 0.8320          | 0.85     | 0.8489 |
| 0.0345        | 15.5388 | 12400 | 0.8118          | 0.853    | 0.8516 |
| 0.0345        | 15.7895 | 12600 | 0.8084          | 0.849    | 0.8480 |
| 0.0292        | 16.0401 | 12800 | 0.8427          | 0.841    | 0.8399 |
| 0.0292        | 16.2907 | 13000 | 0.8499          | 0.848    | 0.8469 |
| 0.0292        | 16.5414 | 13200 | 0.8477          | 0.848    | 0.8470 |
| 0.0292        | 16.7920 | 13400 | 0.8211          | 0.846    | 0.8441 |
| 0.0263        | 17.0426 | 13600 | 0.8418          | 0.848    | 0.8466 |
| 0.0263        | 17.2932 | 13800 | 0.8637          | 0.842    | 0.8408 |
| 0.0263        | 17.5439 | 14000 | 0.8693          | 0.845    | 0.8440 |
| 0.0263        | 17.7945 | 14200 | 0.8415          | 0.85     | 0.8483 |
| 0.0243        | 18.0451 | 14400 | 0.8735          | 0.848    | 0.8470 |
| 0.0243        | 18.2957 | 14600 | 0.8827          | 0.847    | 0.8460 |
| 0.0243        | 18.5464 | 14800 | 0.8729          | 0.85     | 0.8489 |
| 0.0243        | 18.7970 | 15000 | 0.8964          | 0.848    | 0.8469 |
| 0.0207        | 19.0476 | 15200 | 0.8939          | 0.846    | 0.8449 |
| 0.0207        | 19.2982 | 15400 | 0.9019          | 0.847    | 0.8460 |
| 0.0207        | 19.5489 | 15600 | 0.8948          | 0.848    | 0.8469 |
| 0.0207        | 19.7995 | 15800 | 0.9059          | 0.848    | 0.8469 |

## Framework versions

- Transformers 4.43.1
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1
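
To approximate this environment, pinning the listed versions should suffice; note that the exact CUDA build of PyTorch is not recorded in this card.

```bash
pip install transformers==4.43.1 torch==2.1.2 datasets==2.20.0 tokenizers==0.19.1
```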