ner_base

This model is a fine-tuned version of vinai/phobert-base on the hts98/UIT dataset. It achieves the following results on the evaluation set:

  • Loss: 1.6160
  • Precision: 0.6525
  • Recall: 0.7066
  • F1: 0.6785
  • Accuracy: 0.8276
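The sections below are still placeholders, so here is a minimal inference sketch, not an official usage example from the card's author. It assumes the checkpoint loads with the standard Transformers token-classification pipeline and that, like other PhoBERT models, it expects word-segmented Vietnamese input; the sample sentence and whatever entities it yields are illustrative only.

```python
from transformers import pipeline

# Assumption: the checkpoint works with the stock token-classification head.
ner = pipeline(
    "token-classification",
    model="hts98/ner_base",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)

# PhoBERT operates on word-segmented Vietnamese (multi-syllable words joined
# with underscores, e.g. by VnCoreNLP); this sentence is illustrative only.
print(ner("Nguyễn_Văn_A hiện sống ở Hà_Nội ."))
```

Passing aggregation_strategy="simple" merges word-piece predictions back into whole-entity spans; drop it to see the raw per-token tags instead.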

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 120.0
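
As a rough reproduction sketch, the values above map onto Hugging Face TrainingArguments as follows. The output_dir is a placeholder, and the Adam betas and epsilon listed above happen to match the library defaults:

```python
from transformers import TrainingArguments

# Sketch of the reported configuration. output_dir is a placeholder;
# the Adam betas/epsilon below are also the TrainingArguments defaults.
args = TrainingArguments(
    output_dir="ner_base",
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=120.0,
)
```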

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 244 | 0.8214 | 0.4246 | 0.5544 | 0.4809 | 0.7636 |
| No log | 2.0 | 488 | 0.6734 | 0.5305 | 0.6159 | 0.5700 | 0.8023 |
| 0.9764 | 3.0 | 732 | 0.6425 | 0.5591 | 0.6519 | 0.6020 | 0.8072 |
| 0.9764 | 4.0 | 976 | 0.6203 | 0.5664 | 0.6731 | 0.6151 | 0.8215 |
| 0.4504 | 5.0 | 1220 | 0.6483 | 0.5879 | 0.6795 | 0.6304 | 0.8171 |
| 0.4504 | 6.0 | 1464 | 0.6828 | 0.5478 | 0.6826 | 0.6078 | 0.8138 |
| 0.2877 | 7.0 | 1708 | 0.7097 | 0.5868 | 0.6803 | 0.6301 | 0.8115 |
| 0.2877 | 8.0 | 1952 | 0.7538 | 0.5864 | 0.6884 | 0.6334 | 0.8133 |
| 0.1968 | 9.0 | 2196 | 0.7853 | 0.5949 | 0.6837 | 0.6362 | 0.8129 |
| 0.1968 | 10.0 | 2440 | 0.8311 | 0.5984 | 0.6815 | 0.6373 | 0.8094 |
| 0.1443 | 11.0 | 2684 | 0.7910 | 0.6101 | 0.6898 | 0.6475 | 0.8190 |
| 0.1443 | 12.0 | 2928 | 0.8414 | 0.5927 | 0.6803 | 0.6335 | 0.8147 |
| 0.1118 | 13.0 | 3172 | 0.8946 | 0.6023 | 0.6857 | 0.6413 | 0.8069 |
| 0.1118 | 14.0 | 3416 | 0.9195 | 0.6055 | 0.6843 | 0.6425 | 0.8130 |
| 0.0838 | 15.0 | 3660 | 0.9149 | 0.6020 | 0.6882 | 0.6422 | 0.8192 |
| 0.0838 | 16.0 | 3904 | 0.9357 | 0.6072 | 0.6893 | 0.6457 | 0.8201 |
| 0.0661 | 17.0 | 4148 | 0.9784 | 0.6033 | 0.6887 | 0.6432 | 0.8173 |
| 0.0661 | 18.0 | 4392 | 0.9842 | 0.6121 | 0.6868 | 0.6473 | 0.8184 |
| 0.0514 | 19.0 | 4636 | 1.0097 | 0.6105 | 0.6896 | 0.6476 | 0.8164 |
| 0.0514 | 20.0 | 4880 | 1.0300 | 0.6086 | 0.6946 | 0.6488 | 0.8167 |
| 0.0416 | 21.0 | 5124 | 1.0250 | 0.6190 | 0.6904 | 0.6528 | 0.8205 |
| 0.0416 | 22.0 | 5368 | 1.0879 | 0.6171 | 0.6937 | 0.6532 | 0.8167 |
| 0.0324 | 23.0 | 5612 | 1.1349 | 0.6093 | 0.6817 | 0.6435 | 0.8127 |
| 0.0324 | 24.0 | 5856 | 1.0994 | 0.6191 | 0.6935 | 0.6542 | 0.8181 |
| 0.0277 | 25.0 | 6100 | 1.1401 | 0.6180 | 0.6951 | 0.6543 | 0.8153 |
| 0.0277 | 26.0 | 6344 | 1.1868 | 0.5984 | 0.6904 | 0.6411 | 0.8084 |
| 0.0223 | 27.0 | 6588 | 1.2052 | 0.6282 | 0.6859 | 0.6558 | 0.8139 |
| 0.0223 | 28.0 | 6832 | 1.1964 | 0.6168 | 0.6935 | 0.6529 | 0.8153 |
| 0.019 | 29.0 | 7076 | 1.1898 | 0.6214 | 0.6887 | 0.6533 | 0.8202 |
| 0.019 | 30.0 | 7320 | 1.2819 | 0.6136 | 0.6946 | 0.6516 | 0.8135 |
| 0.0159 | 31.0 | 7564 | 1.2687 | 0.6093 | 0.6912 | 0.6477 | 0.8128 |
| 0.0159 | 32.0 | 7808 | 1.2997 | 0.6128 | 0.6971 | 0.6522 | 0.8159 |
| 0.0141 | 33.0 | 8052 | 1.2800 | 0.6172 | 0.6843 | 0.6490 | 0.8157 |
| 0.0141 | 34.0 | 8296 | 1.3110 | 0.6220 | 0.6901 | 0.6543 | 0.8141 |
| 0.0107 | 35.0 | 8540 | 1.3343 | 0.6081 | 0.6910 | 0.6469 | 0.8160 |
| 0.0107 | 36.0 | 8784 | 1.3406 | 0.6111 | 0.6926 | 0.6493 | 0.8130 |
| 0.0106 | 37.0 | 9028 | 1.3921 | 0.6080 | 0.6876 | 0.6454 | 0.8127 |
| 0.0106 | 38.0 | 9272 | 1.4061 | 0.6086 | 0.6868 | 0.6453 | 0.8100 |
| 0.0088 | 39.0 | 9516 | 1.3828 | 0.6293 | 0.6921 | 0.6592 | 0.8166 |
| 0.0088 | 40.0 | 9760 | 1.4263 | 0.6242 | 0.6940 | 0.6572 | 0.8130 |
| 0.0086 | 41.0 | 10004 | 1.3521 | 0.6202 | 0.6993 | 0.6574 | 0.8185 |
| 0.0086 | 42.0 | 10248 | 1.3722 | 0.6451 | 0.6999 | 0.6714 | 0.8196 |
| 0.0086 | 43.0 | 10492 | 1.3784 | 0.6373 | 0.6946 | 0.6647 | 0.8226 |
| 0.0075 | 44.0 | 10736 | 1.4340 | 0.6334 | 0.6940 | 0.6623 | 0.8140 |
| 0.0075 | 45.0 | 10980 | 1.3902 | 0.6321 | 0.7002 | 0.6644 | 0.8161 |
| 0.0066 | 46.0 | 11224 | 1.4019 | 0.6230 | 0.6985 | 0.6586 | 0.8162 |
| 0.0066 | 47.0 | 11468 | 1.4320 | 0.6183 | 0.6960 | 0.6548 | 0.8161 |
| 0.0067 | 48.0 | 11712 | 1.4461 | 0.6326 | 0.6999 | 0.6645 | 0.8200 |
| 0.0067 | 49.0 | 11956 | 1.4327 | 0.6249 | 0.6982 | 0.6595 | 0.8202 |
| 0.0054 | 50.0 | 12200 | 1.4616 | 0.6348 | 0.6943 | 0.6632 | 0.8176 |
| 0.0054 | 51.0 | 12444 | 1.4537 | 0.6135 | 0.7010 | 0.6543 | 0.8177 |
| 0.0052 | 52.0 | 12688 | 1.5622 | 0.6265 | 0.6884 | 0.6560 | 0.8096 |
| 0.0052 | 53.0 | 12932 | 1.4217 | 0.6348 | 0.7027 | 0.6670 | 0.8236 |
| 0.0051 | 54.0 | 13176 | 1.4625 | 0.6266 | 0.6965 | 0.6597 | 0.8217 |
| 0.0051 | 55.0 | 13420 | 1.4359 | 0.6393 | 0.6918 | 0.6645 | 0.8257 |
| 0.0049 | 56.0 | 13664 | 1.4617 | 0.6447 | 0.6977 | 0.6702 | 0.8231 |
| 0.0049 | 57.0 | 13908 | 1.5171 | 0.6337 | 0.6951 | 0.6630 | 0.8181 |
| 0.0037 | 58.0 | 14152 | 1.4999 | 0.6339 | 0.7032 | 0.6668 | 0.8206 |
| 0.0037 | 59.0 | 14396 | 1.4841 | 0.6269 | 0.7007 | 0.6617 | 0.8208 |
| 0.004 | 60.0 | 14640 | 1.4361 | 0.6381 | 0.7044 | 0.6696 | 0.8244 |
| 0.004 | 61.0 | 14884 | 1.4800 | 0.6425 | 0.7035 | 0.6716 | 0.8235 |
| 0.004 | 62.0 | 15128 | 1.4700 | 0.6330 | 0.6991 | 0.6644 | 0.8241 |
| 0.004 | 63.0 | 15372 | 1.5107 | 0.6309 | 0.7016 | 0.6644 | 0.8212 |
| 0.0037 | 64.0 | 15616 | 1.5132 | 0.6389 | 0.7024 | 0.6691 | 0.8227 |
| 0.0037 | 65.0 | 15860 | 1.5229 | 0.6287 | 0.7016 | 0.6631 | 0.8239 |
| 0.0033 | 66.0 | 16104 | 1.5574 | 0.6395 | 0.7027 | 0.6696 | 0.8242 |
| 0.0033 | 67.0 | 16348 | 1.5216 | 0.6270 | 0.7052 | 0.6638 | 0.8196 |
| 0.0033 | 68.0 | 16592 | 1.4877 | 0.6347 | 0.6951 | 0.6636 | 0.8242 |
| 0.0033 | 69.0 | 16836 | 1.5373 | 0.6281 | 0.7021 | 0.6631 | 0.8195 |
| 0.0026 | 70.0 | 17080 | 1.5522 | 0.6335 | 0.7002 | 0.6652 | 0.8201 |
| 0.0026 | 71.0 | 17324 | 1.5180 | 0.6380 | 0.7035 | 0.6691 | 0.8227 |
| 0.0024 | 72.0 | 17568 | 1.5517 | 0.6504 | 0.6974 | 0.6730 | 0.8218 |
| 0.0024 | 73.0 | 17812 | 1.5392 | 0.6332 | 0.7021 | 0.6659 | 0.8206 |
| 0.0026 | 74.0 | 18056 | 1.5396 | 0.6415 | 0.7010 | 0.6700 | 0.8246 |
| 0.0026 | 75.0 | 18300 | 1.5638 | 0.6500 | 0.6999 | 0.6740 | 0.8233 |
| 0.0019 | 76.0 | 18544 | 1.5790 | 0.6438 | 0.6912 | 0.6667 | 0.8202 |
| 0.0019 | 77.0 | 18788 | 1.5546 | 0.6500 | 0.7052 | 0.6765 | 0.8216 |
| 0.0029 | 78.0 | 19032 | 1.5374 | 0.6369 | 0.7032 | 0.6684 | 0.8236 |
| 0.0029 | 79.0 | 19276 | 1.5923 | 0.6351 | 0.6982 | 0.6652 | 0.8180 |
| 0.0015 | 80.0 | 19520 | 1.5728 | 0.6354 | 0.7027 | 0.6674 | 0.8246 |
| 0.0015 | 81.0 | 19764 | 1.5646 | 0.6417 | 0.6979 | 0.6686 | 0.8229 |
| 0.0019 | 82.0 | 20008 | 1.5845 | 0.6247 | 0.7030 | 0.6615 | 0.8211 |
| 0.0019 | 83.0 | 20252 | 1.5894 | 0.6424 | 0.6935 | 0.6669 | 0.8193 |
| 0.0019 | 84.0 | 20496 | 1.6702 | 0.6428 | 0.6907 | 0.6659 | 0.8169 |
| 0.0012 | 85.0 | 20740 | 1.6313 | 0.6342 | 0.7027 | 0.6667 | 0.8189 |
| 0.0012 | 86.0 | 20984 | 1.5829 | 0.6357 | 0.7038 | 0.6680 | 0.8232 |
| 0.0015 | 87.0 | 21228 | 1.6056 | 0.6396 | 0.7085 | 0.6723 | 0.8210 |
| 0.0015 | 88.0 | 21472 | 1.5823 | 0.6471 | 0.7105 | 0.6773 | 0.8225 |
| 0.0015 | 89.0 | 21716 | 1.5736 | 0.6367 | 0.7016 | 0.6676 | 0.8249 |
| 0.0015 | 90.0 | 21960 | 1.5921 | 0.6457 | 0.7016 | 0.6725 | 0.8236 |
| 0.0012 | 91.0 | 22204 | 1.6114 | 0.6371 | 0.7030 | 0.6684 | 0.8231 |
| 0.0012 | 92.0 | 22448 | 1.5752 | 0.6408 | 0.7038 | 0.6708 | 0.8245 |
| 0.0014 | 93.0 | 22692 | 1.6123 | 0.6360 | 0.7018 | 0.6673 | 0.8217 |
| 0.0014 | 94.0 | 22936 | 1.6183 | 0.6374 | 0.7021 | 0.6682 | 0.8221 |
| 0.0009 | 95.0 | 23180 | 1.6078 | 0.6474 | 0.6988 | 0.6721 | 0.8275 |
| 0.0009 | 96.0 | 23424 | 1.6201 | 0.6401 | 0.6991 | 0.6683 | 0.8246 |
| 0.0008 | 97.0 | 23668 | 1.6216 | 0.6388 | 0.7016 | 0.6687 | 0.8238 |
| 0.0008 | 98.0 | 23912 | 1.6113 | 0.6410 | 0.7024 | 0.6703 | 0.8244 |
| 0.0011 | 99.0 | 24156 | 1.5995 | 0.6497 | 0.7027 | 0.6752 | 0.8245 |
| 0.0011 | 100.0 | 24400 | 1.5953 | 0.6423 | 0.7027 | 0.6711 | 0.8259 |
| 0.0009 | 101.0 | 24644 | 1.6178 | 0.6447 | 0.7027 | 0.6725 | 0.8248 |
| 0.0009 | 102.0 | 24888 | 1.6171 | 0.6408 | 0.7066 | 0.6721 | 0.8257 |
| 0.0006 | 103.0 | 25132 | 1.6054 | 0.6508 | 0.7077 | 0.6781 | 0.8271 |
| 0.0006 | 104.0 | 25376 | 1.6218 | 0.6412 | 0.7018 | 0.6701 | 0.8251 |
| 0.0008 | 105.0 | 25620 | 1.6308 | 0.6475 | 0.7024 | 0.6738 | 0.8245 |
| 0.0008 | 106.0 | 25864 | 1.6342 | 0.6471 | 0.7066 | 0.6756 | 0.8267 |
| 0.0004 | 107.0 | 26108 | 1.6346 | 0.6447 | 0.7058 | 0.6739 | 0.8254 |
| 0.0004 | 108.0 | 26352 | 1.6328 | 0.6437 | 0.7066 | 0.6737 | 0.8257 |
| 0.0008 | 109.0 | 26596 | 1.6220 | 0.6476 | 0.7038 | 0.6745 | 0.8257 |
| 0.0008 | 110.0 | 26840 | 1.6160 | 0.6525 | 0.7066 | 0.6785 | 0.8276 |
| 0.0006 | 111.0 | 27084 | 1.6100 | 0.6455 | 0.7055 | 0.6741 | 0.8270 |
| 0.0006 | 112.0 | 27328 | 1.6270 | 0.6394 | 0.7055 | 0.6708 | 0.8247 |
| 0.0005 | 113.0 | 27572 | 1.6234 | 0.6505 | 0.7024 | 0.6754 | 0.8273 |
| 0.0005 | 114.0 | 27816 | 1.6328 | 0.6417 | 0.7035 | 0.6712 | 0.8252 |
| 0.0004 | 115.0 | 28060 | 1.6352 | 0.6428 | 0.7018 | 0.6710 | 0.8251 |
| 0.0004 | 116.0 | 28304 | 1.6269 | 0.6458 | 0.7055 | 0.6743 | 0.8265 |
| 0.0005 | 117.0 | 28548 | 1.6377 | 0.6442 | 0.7041 | 0.6728 | 0.8253 |
| 0.0005 | 118.0 | 28792 | 1.6353 | 0.6450 | 0.7049 | 0.6736 | 0.8257 |
| 0.0004 | 119.0 | 29036 | 1.6395 | 0.6476 | 0.7044 | 0.6748 | 0.8257 |
| 0.0004 | 120.0 | 29280 | 1.6385 | 0.6467 | 0.7038 | 0.6741 | 0.8256 |
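
The headline metrics at the top of this card match the epoch-110 row, where validation F1 peaks at 0.6785, even though training ran to epoch 120. The card does not say how precision, recall, and F1 were computed; cards generated by the Transformers token-classification example script typically use entity-level seqeval scoring, sketched below on made-up BIO tags (an assumption, not stated here):

```python
from seqeval.metrics import f1_score, precision_score, recall_score

# Made-up BIO sequences purely to illustrate entity-level scoring:
# the partially matched PER span counts as a miss, not partial credit.
y_true = [["B-PER", "I-PER", "O", "B-LOC"]]
y_pred = [["B-PER", "O",     "O", "B-LOC"]]

print(precision_score(y_true, y_pred))  # 0.5 -> one of two predicted spans exact
print(recall_score(y_true, y_pred))     # 0.5 -> one of two gold spans recovered
print(f1_score(y_true, y_pred))         # 0.5
```

Under entity-level scoring a span only counts if both its boundaries and its type match exactly, which is why the F1 column sits well below the token-level accuracy column.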

Framework versions

  • Transformers 4.32.0.dev0
  • Pytorch 2.1.0+cu121
  • Datasets 3.1.0
  • Tokenizers 0.13.3