---
license: mit
base_model: vinai/phobert-large
tags:
  - generated_from_trainer
datasets:
  - hts98/UIT
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: ner
    results:
      - task:
          name: Token Classification
          type: token-classification
        dataset:
          name: hts98/UIT
          type: hts98/UIT
        metrics:
          - name: Precision
            type: precision
            value: 0.6879356568364611
          - name: Recall
            type: recall
            value: 0.7163595756560581
          - name: F1
            type: f1
            value: 0.7018599562363238
          - name: Accuracy
            type: accuracy
            value: 0.8296796355048782
---

# ner

This model is a fine-tuned version of [vinai/phobert-large](https://huggingface.co/vinai/phobert-large) on the hts98/UIT dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the results):

- Loss: 1.8209
- Precision: 0.6879
- Recall: 0.7164
- F1: 0.7019
- Accuracy: 0.8297
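
The following is a minimal inference sketch using the `transformers` pipeline API. The repository id `hts98/ner` and the example sentence are assumptions, not confirmed by the card; note that PhoBERT-based models expect word-segmented Vietnamese input (multi-syllable words joined with underscores, e.g. via VnCoreNLP).

```python
from transformers import pipeline

# "hts98/ner" is a hypothetical repository id; replace it with the actual
# model path if it differs.
ner = pipeline(
    "token-classification",
    model="hts98/ner",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

# PhoBERT expects word-segmented input: multi-syllable Vietnamese words
# are joined with underscores before tokenization.
print(ner("Nguyễn_Phú_Trọng sinh tại Hà_Nội ."))
```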

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a minimal `TrainingArguments` sketch follows the list):

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 120.0
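
These settings map directly onto `transformers.TrainingArguments`; the sketch below mirrors them under that assumption (the `output_dir` is illustrative, and the Adam betas/epsilon listed above are the library defaults):

```python
from transformers import TrainingArguments

# Sketch only: reproduces the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="ner",  # illustrative output directory
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=120.0,
    adam_beta1=0.9,    # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```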

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:--:|:--------:|
| No log | 1.0 | 487 | 0.6517 | 0.5180 | 0.6267 | 0.5672 | 0.7979 |
| 1.0091 | 2.0 | 974 | 0.6198 | 0.5438 | 0.6583 | 0.5956 | 0.8116 |
| 0.5042 | 3.0 | 1461 | 0.6614 | 0.5677 | 0.6745 | 0.6165 | 0.8094 |
| 0.3543 | 4.0 | 1948 | 0.6794 | 0.5792 | 0.6826 | 0.6267 | 0.8231 |
| 0.2476 | 5.0 | 2435 | 0.7132 | 0.6090 | 0.7041 | 0.6531 | 0.8249 |
| 0.1849 | 6.0 | 2922 | 0.7761 | 0.6023 | 0.6926 | 0.6443 | 0.8219 |
| 0.1465 | 7.0 | 3409 | 0.8294 | 0.5965 | 0.7007 | 0.6444 | 0.8173 |
| 0.1176 | 8.0 | 3896 | 0.8653 | 0.6150 | 0.6935 | 0.6519 | 0.8230 |
| 0.1023 | 9.0 | 4383 | 0.8614 | 0.6123 | 0.6926 | 0.6500 | 0.8226 |
| 0.0823 | 10.0 | 4870 | 0.9825 | 0.6073 | 0.6848 | 0.6437 | 0.8216 |
| 0.069 | 11.0 | 5357 | 0.9783 | 0.6246 | 0.6957 | 0.6582 | 0.8248 |
| 0.0578 | 12.0 | 5844 | 1.0037 | 0.6115 | 0.7030 | 0.6540 | 0.8224 |
| 0.0522 | 13.0 | 6331 | 1.0799 | 0.6177 | 0.6829 | 0.6486 | 0.8161 |
| 0.0461 | 14.0 | 6818 | 1.0693 | 0.6088 | 0.7016 | 0.6519 | 0.8203 |
| 0.0402 | 15.0 | 7305 | 1.0560 | 0.6158 | 0.6991 | 0.6548 | 0.8230 |
| 0.0369 | 16.0 | 7792 | 1.1046 | 0.6307 | 0.6910 | 0.6595 | 0.8197 |
| 0.0391 | 17.0 | 8279 | 1.1480 | 0.6228 | 0.6873 | 0.6535 | 0.8233 |
| 0.0537 | 18.0 | 8766 | 1.2141 | 0.6234 | 0.6873 | 0.6538 | 0.8204 |
| 0.0497 | 19.0 | 9253 | 1.2230 | 0.6241 | 0.6957 | 0.6580 | 0.8189 |
| 0.0512 | 20.0 | 9740 | 1.2078 | 0.6357 | 0.7016 | 0.6670 | 0.8268 |
| 0.0508 | 21.0 | 10227 | 1.1941 | 0.6153 | 0.6921 | 0.6514 | 0.8178 |
| 0.044 | 22.0 | 10714 | 1.3114 | 0.6377 | 0.6924 | 0.6639 | 0.8161 |
| 0.041 | 23.0 | 11201 | 1.2640 | 0.6191 | 0.6884 | 0.6519 | 0.8165 |
| 0.0216 | 24.0 | 11688 | 1.3127 | 0.6349 | 0.6929 | 0.6627 | 0.8240 |
| 0.0187 | 25.0 | 12175 | 1.3329 | 0.6452 | 0.7004 | 0.6717 | 0.8229 |
| 0.0158 | 26.0 | 12662 | 1.2958 | 0.6243 | 0.7004 | 0.6602 | 0.8177 |
| 0.0151 | 27.0 | 13149 | 1.3276 | 0.6204 | 0.6985 | 0.6571 | 0.8181 |
| 0.016 | 28.0 | 13636 | 1.2671 | 0.6481 | 0.6999 | 0.6730 | 0.8251 |
| 0.0157 | 29.0 | 14123 | 1.3374 | 0.6191 | 0.6946 | 0.6547 | 0.8204 |
| 0.0146 | 30.0 | 14610 | 1.3941 | 0.6558 | 0.6932 | 0.6740 | 0.8192 |
| 0.0134 | 31.0 | 15097 | 1.4215 | 0.6344 | 0.6854 | 0.6589 | 0.8164 |
| 0.0146 | 32.0 | 15584 | 1.4602 | 0.6510 | 0.6937 | 0.6717 | 0.8198 |
| 0.0105 | 33.0 | 16071 | 1.4085 | 0.6459 | 0.7038 | 0.6736 | 0.8240 |
| 0.0135 | 34.0 | 16558 | 1.3593 | 0.6337 | 0.7002 | 0.6653 | 0.8166 |
| 0.0155 | 35.0 | 17045 | 1.3412 | 0.6519 | 0.6943 | 0.6724 | 0.8222 |
| 0.0141 | 36.0 | 17532 | 1.3676 | 0.6385 | 0.7021 | 0.6688 | 0.8219 |
| 0.0145 | 37.0 | 18019 | 1.3878 | 0.6573 | 0.6993 | 0.6777 | 0.8251 |
| 0.0106 | 38.0 | 18506 | 1.4314 | 0.6298 | 0.7016 | 0.6638 | 0.8239 |
| 0.0106 | 39.0 | 18993 | 1.3729 | 0.6666 | 0.7071 | 0.6863 | 0.8282 |
| 0.0112 | 40.0 | 19480 | 1.3455 | 0.6506 | 0.7032 | 0.6759 | 0.8283 |
| 0.0109 | 41.0 | 19967 | 1.3884 | 0.6429 | 0.7060 | 0.6730 | 0.8278 |
| 0.0084 | 42.0 | 20454 | 1.4240 | 0.6428 | 0.7080 | 0.6738 | 0.8255 |
| 0.0082 | 43.0 | 20941 | 1.4280 | 0.6091 | 0.6829 | 0.6439 | 0.8176 |
| 0.0122 | 44.0 | 21428 | 1.4723 | 0.6533 | 0.7032 | 0.6773 | 0.8239 |
| 0.0082 | 45.0 | 21915 | 1.5151 | 0.6189 | 0.6960 | 0.6552 | 0.8180 |
| 0.0068 | 46.0 | 22402 | 1.4441 | 0.6331 | 0.7046 | 0.6669 | 0.8211 |
| 0.0074 | 47.0 | 22889 | 1.4753 | 0.6497 | 0.6974 | 0.6727 | 0.8203 |
| 0.0076 | 48.0 | 23376 | 1.5148 | 0.6515 | 0.6957 | 0.6729 | 0.8215 |
| 0.0098 | 49.0 | 23863 | 1.4481 | 0.6319 | 0.6974 | 0.6630 | 0.8233 |
| 0.0104 | 50.0 | 24350 | 1.4814 | 0.6585 | 0.7074 | 0.6821 | 0.8235 |
| 0.0119 | 51.0 | 24837 | 1.4050 | 0.6555 | 0.7133 | 0.6832 | 0.8264 |
| 0.0078 | 52.0 | 25324 | 1.4854 | 0.6615 | 0.7049 | 0.6825 | 0.8234 |
| 0.007 | 53.0 | 25811 | 1.4941 | 0.6476 | 0.7013 | 0.6734 | 0.8204 |
| 0.0079 | 54.0 | 26298 | 1.4138 | 0.6529 | 0.7088 | 0.6797 | 0.8228 |
| 0.0092 | 55.0 | 26785 | 1.4301 | 0.6762 | 0.7018 | 0.6888 | 0.8218 |
| 0.0097 | 56.0 | 27272 | 1.5276 | 0.6544 | 0.6974 | 0.6752 | 0.8182 |
| 0.0076 | 57.0 | 27759 | 1.4302 | 0.6517 | 0.7032 | 0.6765 | 0.8258 |
| 0.0056 | 58.0 | 28246 | 1.4996 | 0.6675 | 0.7046 | 0.6856 | 0.8265 |
| 0.0047 | 59.0 | 28733 | 1.4309 | 0.6625 | 0.7032 | 0.6823 | 0.8241 |
| 0.0126 | 60.0 | 29220 | 1.4903 | 0.6457 | 0.7002 | 0.6718 | 0.8172 |
| 0.0054 | 61.0 | 29707 | 1.4318 | 0.6398 | 0.7035 | 0.6701 | 0.8218 |
| 0.0076 | 62.0 | 30194 | 1.5745 | 0.6660 | 0.6988 | 0.6820 | 0.8196 |
| 0.0043 | 63.0 | 30681 | 1.5102 | 0.6607 | 0.7058 | 0.6825 | 0.8268 |
| 0.0046 | 64.0 | 31168 | 1.5500 | 0.6655 | 0.6949 | 0.6799 | 0.8252 |
| 0.0042 | 65.0 | 31655 | 1.5357 | 0.6555 | 0.7138 | 0.6834 | 0.8274 |
| 0.0039 | 66.0 | 32142 | 1.5469 | 0.6650 | 0.7105 | 0.6870 | 0.8220 |
| 0.004 | 67.0 | 32629 | 1.4814 | 0.6542 | 0.7147 | 0.6831 | 0.8289 |
| 0.0031 | 68.0 | 33116 | 1.5210 | 0.6545 | 0.7097 | 0.6810 | 0.8250 |
| 0.0047 | 69.0 | 33603 | 1.5326 | 0.6549 | 0.7083 | 0.6805 | 0.8272 |
| 0.0029 | 70.0 | 34090 | 1.6057 | 0.6643 | 0.7027 | 0.6829 | 0.8226 |
| 0.0027 | 71.0 | 34577 | 1.5920 | 0.6594 | 0.7141 | 0.6857 | 0.8255 |
| 0.0016 | 72.0 | 35064 | 1.6220 | 0.6668 | 0.7024 | 0.6842 | 0.8255 |
| 0.0025 | 73.0 | 35551 | 1.6261 | 0.6803 | 0.7027 | 0.6913 | 0.8239 |
| 0.0037 | 74.0 | 36038 | 1.6440 | 0.6769 | 0.7049 | 0.6906 | 0.8207 |
| 0.003 | 75.0 | 36525 | 1.6027 | 0.6701 | 0.7071 | 0.6881 | 0.8263 |
| 0.0031 | 76.0 | 37012 | 1.6013 | 0.6670 | 0.7141 | 0.6898 | 0.8262 |
| 0.0031 | 77.0 | 37499 | 1.6714 | 0.6434 | 0.7147 | 0.6772 | 0.8185 |
| 0.002 | 78.0 | 37986 | 1.6293 | 0.6666 | 0.7071 | 0.6863 | 0.8267 |
| 0.0024 | 79.0 | 38473 | 1.6796 | 0.6578 | 0.7094 | 0.6826 | 0.8222 |
| 0.003 | 80.0 | 38960 | 1.6463 | 0.6701 | 0.7094 | 0.6892 | 0.8283 |
| 0.0015 | 81.0 | 39447 | 1.6634 | 0.6765 | 0.7074 | 0.6916 | 0.8266 |
| 0.003 | 82.0 | 39934 | 1.6947 | 0.6636 | 0.7055 | 0.6839 | 0.8255 |
| 0.0036 | 83.0 | 40421 | 1.6515 | 0.6554 | 0.7046 | 0.6791 | 0.8227 |
| 0.0018 | 84.0 | 40908 | 1.6855 | 0.6641 | 0.7102 | 0.6864 | 0.8266 |
| 0.0012 | 85.0 | 41395 | 1.6966 | 0.6545 | 0.7108 | 0.6815 | 0.8241 |
| 0.0019 | 86.0 | 41882 | 1.6564 | 0.6623 | 0.7058 | 0.6833 | 0.8255 |
| 0.0015 | 87.0 | 42369 | 1.6363 | 0.6501 | 0.7080 | 0.6778 | 0.8239 |
| 0.0022 | 88.0 | 42856 | 1.6879 | 0.6813 | 0.7055 | 0.6932 | 0.8260 |
| 0.0011 | 89.0 | 43343 | 1.6870 | 0.6660 | 0.7113 | 0.6879 | 0.8294 |
| 0.0017 | 90.0 | 43830 | 1.7018 | 0.6707 | 0.7041 | 0.6870 | 0.8276 |
| 0.0016 | 91.0 | 44317 | 1.6699 | 0.6701 | 0.7133 | 0.6910 | 0.8281 |
| 0.0015 | 92.0 | 44804 | 1.6737 | 0.6773 | 0.7125 | 0.6944 | 0.8320 |
| 0.0017 | 93.0 | 45291 | 1.7271 | 0.6769 | 0.7189 | 0.6973 | 0.8280 |
| 0.0005 | 94.0 | 45778 | 1.7245 | 0.6654 | 0.7127 | 0.6882 | 0.8261 |
| 0.0013 | 95.0 | 46265 | 1.8143 | 0.6772 | 0.7052 | 0.6909 | 0.8235 |
| 0.0012 | 96.0 | 46752 | 1.7299 | 0.6736 | 0.7091 | 0.6909 | 0.8262 |
| 0.002 | 97.0 | 47239 | 1.7251 | 0.6758 | 0.7125 | 0.6937 | 0.8273 |
| 0.0009 | 98.0 | 47726 | 1.7183 | 0.6565 | 0.7183 | 0.6860 | 0.8262 |
| 0.0009 | 99.0 | 48213 | 1.7801 | 0.6759 | 0.7116 | 0.6933 | 0.8279 |
| 0.0008 | 100.0 | 48700 | 1.7749 | 0.6817 | 0.7108 | 0.6959 | 0.8263 |
| 0.0006 | 101.0 | 49187 | 1.7413 | 0.6732 | 0.7113 | 0.6917 | 0.8272 |
| 0.0005 | 102.0 | 49674 | 1.7939 | 0.6648 | 0.7144 | 0.6887 | 0.8270 |
| 0.0008 | 103.0 | 50161 | 1.7955 | 0.6602 | 0.7111 | 0.6847 | 0.8237 |
| 0.0007 | 104.0 | 50648 | 1.7844 | 0.6686 | 0.7130 | 0.6901 | 0.8266 |
| 0.0005 | 105.0 | 51135 | 1.7983 | 0.6808 | 0.7127 | 0.6964 | 0.8279 |
| 0.0004 | 106.0 | 51622 | 1.7945 | 0.6798 | 0.7130 | 0.6960 | 0.8256 |
| 0.0005 | 107.0 | 52109 | 1.8209 | 0.6879 | 0.7164 | 0.7019 | 0.8297 |
| 0.0004 | 108.0 | 52596 | 1.8150 | 0.6839 | 0.7085 | 0.6960 | 0.8281 |
| 0.0006 | 109.0 | 53083 | 1.7784 | 0.6778 | 0.7166 | 0.6967 | 0.8287 |
| 0.0009 | 110.0 | 53570 | 1.7941 | 0.6761 | 0.7180 | 0.6965 | 0.8293 |
| 0.0006 | 111.0 | 54057 | 1.8079 | 0.6762 | 0.7200 | 0.6974 | 0.8280 |
| 0.0006 | 112.0 | 54544 | 1.7968 | 0.6752 | 0.7166 | 0.6953 | 0.8277 |
| 0.0003 | 113.0 | 55031 | 1.7972 | 0.6753 | 0.7166 | 0.6954 | 0.8285 |
| 0.0003 | 114.0 | 55518 | 1.7985 | 0.6764 | 0.7172 | 0.6962 | 0.8298 |
| 0.0006 | 115.0 | 56005 | 1.8048 | 0.6759 | 0.7172 | 0.6959 | 0.8287 |
| 0.0006 | 116.0 | 56492 | 1.7985 | 0.6758 | 0.7152 | 0.6950 | 0.8298 |
| 0.0004 | 117.0 | 56979 | 1.7883 | 0.6835 | 0.7164 | 0.6996 | 0.8314 |
| 0.0009 | 118.0 | 57466 | 1.7852 | 0.6830 | 0.7180 | 0.7001 | 0.8311 |
| 0.0002 | 119.0 | 57953 | 1.7869 | 0.6853 | 0.7180 | 0.7013 | 0.8309 |
| 0.0003 | 120.0 | 58440 | 1.7865 | 0.6846 | 0.7180 | 0.7009 | 0.8312 |
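
The card does not say how these entity-level scores were computed, but token-classification runs of this kind typically use `seqeval`; a minimal sketch with illustrative IOB2 label sequences:

```python
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

# Illustrative gold and predicted tag sequences (one sentence each).
y_true = [["B-PER", "I-PER", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "O"]]

print("precision:", precision_score(y_true, y_pred))  # 1.0   (1 of 1 predicted entities correct)
print("recall:", recall_score(y_true, y_pred))        # 0.5   (1 of 2 gold entities found)
print("f1:", f1_score(y_true, y_pred))                # 0.667
print("accuracy:", accuracy_score(y_true, y_pred))    # 0.75  (token-level)
```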

### Framework versions

- Transformers 4.32.0.dev0
- PyTorch 2.1.0+cu121
- Datasets 3.1.0
- Tokenizers 0.13.3