
nerui-seq_bn-4

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0476
  • Location Precision: 0.8899
  • Location Recall: 0.9417
  • Location F1: 0.9151
  • Location Number: 103
  • Organization Precision: 0.9006
  • Organization Recall: 0.9006
  • Organization F1: 0.9006
  • Organization Number: 171
  • Person Precision: 0.9697
  • Person Recall: 0.9771
  • Person F1: 0.9734
  • Person Number: 131
  • Overall Precision: 0.9199
  • Overall Recall: 0.9358
  • Overall F1: 0.9278
  • Overall Accuracy: 0.9859
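As a sanity check, the overall precision and recall above are micro-averages over all 405 evaluation entities and can be reconstructed from the per-type precision, recall, and support figures. A minimal sketch (rounding the recovered true-positive and predicted counts to whole numbers is an assumption):

```python
# Reconstruct micro-averaged precision/recall from per-type metrics.
# (precision, recall, support) per entity type, from the evaluation set.
per_type = {
    "Location": (0.8899, 0.9417, 103),
    "Organization": (0.9006, 0.9006, 171),
    "Person": (0.9697, 0.9771, 131),
}

tp = fp = fn = 0
for precision, recall, support in per_type.values():
    true_pos = round(recall * support)       # recall = TP / support
    predicted = round(true_pos / precision)  # precision = TP / predicted
    tp += true_pos
    fp += predicted - true_pos
    fn += support - true_pos

overall_precision = tp / (tp + fp)
overall_recall = tp / (tp + fn)
overall_f1 = 2 * overall_precision * overall_recall / (overall_precision + overall_recall)
print(round(overall_precision, 4), round(overall_recall, 4), round(overall_f1, 4))
# → 0.9199 0.9358 0.9278
```

The recovered values match the reported overall precision, recall, and F1 exactly, confirming that the overall numbers are entity-level micro-averages rather than macro-averages of the three per-type scores.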

Model description

This model performs named-entity recognition on Indonesian text, tagging Location, Organization, and Person entities (the three entity types reported in the evaluation above). Further details have not been provided.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
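These hyperparameters imply the approximate size of the training split: the training log below reports 96 optimizer steps per epoch (9,600 steps over 100 epochs) at a train batch size of 16. A quick sketch of that arithmetic (assuming no gradient accumulation, which the card does not mention):

```python
# Back out the approximate training-set size from the schedule.
total_steps = 9600
num_epochs = 100
train_batch_size = 16

steps_per_epoch = total_steps // num_epochs  # matches the 96 steps/epoch in the log

# With no gradient accumulation (an assumption), each step consumes one batch.
approx_train_examples = steps_per_epoch * train_batch_size
print(steps_per_epoch, approx_train_examples)
# → 96 1536
```

So the training split contains roughly 1,500 examples, consistent with the 405-entity evaluation split being a smaller held-out set.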

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8449 | 1.0 | 96 | 0.5421 | 0.0 | 0.0 | 0.0 | 103 | 0.0 | 0.0 | 0.0 | 171 | 0.0 | 0.0 | 0.0 | 131 | 0.0 | 0.0 | 0.0 | 0.8379 |
| 0.4618 | 2.0 | 192 | 0.3207 | 0.4444 | 0.1553 | 0.2302 | 103 | 0.3839 | 0.5029 | 0.4354 | 171 | 0.3259 | 0.5573 | 0.4113 | 131 | 0.3616 | 0.4321 | 0.3937 | 0.8995 |
| 0.3133 | 3.0 | 288 | 0.2317 | 0.4598 | 0.3883 | 0.4211 | 103 | 0.5872 | 0.7485 | 0.6581 | 171 | 0.5882 | 0.7634 | 0.6645 | 131 | 0.5642 | 0.6617 | 0.6091 | 0.9362 |
| 0.219 | 4.0 | 384 | 0.1434 | 0.5833 | 0.6117 | 0.5972 | 103 | 0.7208 | 0.8304 | 0.7717 | 171 | 0.8435 | 0.9466 | 0.8921 | 131 | 0.7279 | 0.8123 | 0.7678 | 0.9624 |
| 0.1518 | 5.0 | 480 | 0.1125 | 0.7368 | 0.8155 | 0.7742 | 103 | 0.7841 | 0.8070 | 0.7954 | 171 | 0.9044 | 0.9389 | 0.9213 | 131 | 0.8099 | 0.8519 | 0.8303 | 0.9699 |
| 0.1247 | 6.0 | 576 | 0.0863 | 0.7739 | 0.8641 | 0.8165 | 103 | 0.8372 | 0.8421 | 0.8397 | 171 | 0.9333 | 0.9618 | 0.9474 | 131 | 0.8507 | 0.8864 | 0.8682 | 0.9743 |
| 0.1071 | 7.0 | 672 | 0.0788 | 0.8230 | 0.9029 | 0.8611 | 103 | 0.8409 | 0.8655 | 0.8530 | 171 | 0.9137 | 0.9695 | 0.9407 | 131 | 0.8598 | 0.9086 | 0.8836 | 0.9760 |
| 0.0973 | 8.0 | 768 | 0.0731 | 0.8034 | 0.9126 | 0.8545 | 103 | 0.8512 | 0.8363 | 0.8437 | 171 | 0.9478 | 0.9695 | 0.9585 | 131 | 0.8687 | 0.8988 | 0.8835 | 0.9782 |
| 0.0908 | 9.0 | 864 | 0.0648 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.8743 | 0.8947 | 0.8844 | 171 | 0.9338 | 0.9695 | 0.9513 | 131 | 0.8931 | 0.9284 | 0.9104 | 0.9804 |
| 0.0806 | 10.0 | 960 | 0.0618 | 0.8991 | 0.9515 | 0.9245 | 103 | 0.8793 | 0.8947 | 0.8870 | 171 | 0.9478 | 0.9695 | 0.9585 | 131 | 0.9065 | 0.9333 | 0.9197 | 0.9815 |
| 0.0744 | 11.0 | 1056 | 0.0592 | 0.9048 | 0.9223 | 0.9135 | 103 | 0.8652 | 0.9006 | 0.8825 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9038 | 0.9284 | 0.9160 | 0.9820 |
| 0.0725 | 12.0 | 1152 | 0.0561 | 0.9159 | 0.9515 | 0.9333 | 103 | 0.8715 | 0.9123 | 0.8914 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9093 | 0.9407 | 0.9248 | 0.9823 |
| 0.0689 | 13.0 | 1248 | 0.0552 | 0.9143 | 0.9320 | 0.9231 | 103 | 0.9048 | 0.8889 | 0.8968 | 171 | 0.9478 | 0.9695 | 0.9585 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9829 |
| 0.0648 | 14.0 | 1344 | 0.0568 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9051 | 0.8363 | 0.8693 | 171 | 0.9478 | 0.9695 | 0.9585 | 131 | 0.9104 | 0.9037 | 0.9071 | 0.9812 |
| 0.0606 | 15.0 | 1440 | 0.0477 | 0.9320 | 0.9320 | 0.9320 | 103 | 0.9053 | 0.8947 | 0.9000 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9284 | 0.9284 | 0.9284 | 0.9851 |
| 0.0559 | 16.0 | 1536 | 0.0462 | 0.8981 | 0.9417 | 0.9194 | 103 | 0.8988 | 0.8830 | 0.8909 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9169 | 0.9259 | 0.9214 | 0.9840 |
| 0.056 | 17.0 | 1632 | 0.0483 | 0.9495 | 0.9126 | 0.9307 | 103 | 0.8927 | 0.9240 | 0.9080 | 171 | 0.9478 | 0.9695 | 0.9585 | 131 | 0.9244 | 0.9358 | 0.9301 | 0.9845 |
| 0.0518 | 18.0 | 1728 | 0.0455 | 0.8981 | 0.9417 | 0.9194 | 103 | 0.9112 | 0.9006 | 0.9059 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9242 | 0.9333 | 0.9287 | 0.9848 |
| 0.049 | 19.0 | 1824 | 0.0502 | 0.9057 | 0.9320 | 0.9187 | 103 | 0.9212 | 0.8889 | 0.9048 | 171 | 0.9478 | 0.9695 | 0.9585 | 131 | 0.9259 | 0.9259 | 0.9259 | 0.9834 |
| 0.0495 | 20.0 | 1920 | 0.0447 | 0.9151 | 0.9417 | 0.9282 | 103 | 0.88 | 0.9006 | 0.8902 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9130 | 0.9333 | 0.9231 | 0.9851 |
| 0.0444 | 21.0 | 2016 | 0.0477 | 0.8991 | 0.9515 | 0.9245 | 103 | 0.9096 | 0.8830 | 0.8961 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9216 | 0.9284 | 0.9250 | 0.9843 |
| 0.0432 | 22.0 | 2112 | 0.0419 | 0.9151 | 0.9417 | 0.9282 | 103 | 0.8807 | 0.9064 | 0.8934 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9179 | 0.9383 | 0.9280 | 0.9856 |
| 0.0419 | 23.0 | 2208 | 0.0434 | 0.8991 | 0.9515 | 0.9245 | 103 | 0.8941 | 0.8889 | 0.8915 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9150 | 0.9309 | 0.9229 | 0.9854 |
| 0.0401 | 24.0 | 2304 | 0.0464 | 0.875 | 0.9515 | 0.9116 | 103 | 0.8830 | 0.8830 | 0.8830 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9038 | 0.9284 | 0.9160 | 0.9845 |
| 0.0388 | 25.0 | 2400 | 0.0469 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9191 | 0.9259 | 0.9225 | 0.9837 |
| 0.0364 | 26.0 | 2496 | 0.0399 | 0.9238 | 0.9417 | 0.9327 | 103 | 0.9012 | 0.9064 | 0.9038 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9291 | 0.9383 | 0.9337 | 0.9867 |
| 0.036 | 27.0 | 2592 | 0.0420 | 0.9245 | 0.9515 | 0.9378 | 103 | 0.9006 | 0.9006 | 0.9006 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9244 | 0.9358 | 0.9301 | 0.9859 |
| 0.0349 | 28.0 | 2688 | 0.0393 | 0.9151 | 0.9417 | 0.9282 | 103 | 0.8920 | 0.9181 | 0.9049 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9181 | 0.9407 | 0.9293 | 0.9862 |
| 0.0328 | 29.0 | 2784 | 0.0462 | 0.9151 | 0.9417 | 0.9282 | 103 | 0.9107 | 0.8947 | 0.9027 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.9263 | 0.9309 | 0.9286 | 0.9843 |
| 0.0334 | 30.0 | 2880 | 0.0413 | 0.9238 | 0.9417 | 0.9327 | 103 | 0.8983 | 0.9298 | 0.9138 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9275 | 0.9481 | 0.9377 | 0.9862 |
| 0.0334 | 31.0 | 2976 | 0.0401 | 0.8889 | 0.9320 | 0.9100 | 103 | 0.8960 | 0.9064 | 0.9012 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9177 | 0.9358 | 0.9267 | 0.9854 |
| 0.0311 | 32.0 | 3072 | 0.0398 | 0.8919 | 0.9612 | 0.9252 | 103 | 0.9112 | 0.9006 | 0.9059 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9248 | 0.9407 | 0.9327 | 0.9862 |
| 0.0281 | 33.0 | 3168 | 0.0389 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9059 | 0.9006 | 0.9032 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9223 | 0.9383 | 0.9302 | 0.9865 |
| 0.0326 | 34.0 | 3264 | 0.0407 | 0.9065 | 0.9417 | 0.9238 | 103 | 0.8908 | 0.9064 | 0.8986 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9201 | 0.9383 | 0.9291 | 0.9862 |
| 0.026 | 35.0 | 3360 | 0.0437 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9102 | 0.8889 | 0.8994 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9242 | 0.9333 | 0.9287 | 0.9854 |
| 0.0278 | 36.0 | 3456 | 0.0416 | 0.8919 | 0.9612 | 0.9252 | 103 | 0.9222 | 0.9006 | 0.9112 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9293 | 0.9407 | 0.9350 | 0.9862 |
| 0.0263 | 37.0 | 3552 | 0.0406 | 0.9307 | 0.9126 | 0.9216 | 103 | 0.8883 | 0.9298 | 0.9086 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9248 | 0.9407 | 0.9327 | 0.9862 |
| 0.0241 | 38.0 | 3648 | 0.0433 | 0.8761 | 0.9612 | 0.9167 | 103 | 0.9112 | 0.9006 | 0.9059 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9203 | 0.9407 | 0.9304 | 0.9862 |
| 0.0231 | 39.0 | 3744 | 0.0409 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9172 | 0.9064 | 0.9118 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9270 | 0.9407 | 0.9338 | 0.9856 |
| 0.0229 | 40.0 | 3840 | 0.0437 | 0.9245 | 0.9515 | 0.9378 | 103 | 0.9128 | 0.9181 | 0.9155 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9341 | 0.9457 | 0.9399 | 0.9862 |
| 0.0221 | 41.0 | 3936 | 0.0396 | 0.9143 | 0.9320 | 0.9231 | 103 | 0.9017 | 0.9123 | 0.9070 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9268 | 0.9383 | 0.9325 | 0.9859 |
| 0.0229 | 42.0 | 4032 | 0.0428 | 0.8991 | 0.9515 | 0.9245 | 103 | 0.9157 | 0.8889 | 0.9021 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9287 | 0.9333 | 0.9310 | 0.9854 |
| 0.0225 | 43.0 | 4128 | 0.0408 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9172 | 0.9064 | 0.9118 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9270 | 0.9407 | 0.9338 | 0.9862 |
| 0.0217 | 44.0 | 4224 | 0.0491 | 0.8991 | 0.9515 | 0.9245 | 103 | 0.9317 | 0.8772 | 0.9036 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9353 | 0.9284 | 0.9318 | 0.9851 |
| 0.0209 | 45.0 | 4320 | 0.0460 | 0.8761 | 0.9612 | 0.9167 | 103 | 0.9157 | 0.8889 | 0.9021 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9221 | 0.9358 | 0.9289 | 0.9854 |
| 0.0196 | 46.0 | 4416 | 0.0433 | 0.8839 | 0.9612 | 0.9209 | 103 | 0.9123 | 0.9123 | 0.9123 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9229 | 0.9457 | 0.9341 | 0.9865 |
| 0.0201 | 47.0 | 4512 | 0.0405 | 0.8962 | 0.9223 | 0.9091 | 103 | 0.8857 | 0.9064 | 0.8960 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9153 | 0.9333 | 0.9242 | 0.9862 |
| 0.0187 | 48.0 | 4608 | 0.0459 | 0.8919 | 0.9612 | 0.9252 | 103 | 0.9222 | 0.9006 | 0.9112 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9293 | 0.9407 | 0.9350 | 0.9865 |
| 0.0172 | 49.0 | 4704 | 0.0445 | 0.8889 | 0.9320 | 0.9100 | 103 | 0.9012 | 0.9064 | 0.9038 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9199 | 0.9358 | 0.9278 | 0.9854 |
| 0.0171 | 50.0 | 4800 | 0.0462 | 0.8889 | 0.9320 | 0.9100 | 103 | 0.9064 | 0.9064 | 0.9064 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9221 | 0.9358 | 0.9289 | 0.9856 |
| 0.018 | 51.0 | 4896 | 0.0490 | 0.9065 | 0.9417 | 0.9238 | 103 | 0.9337 | 0.9064 | 0.9199 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9383 | 0.9383 | 0.9383 | 0.9865 |
| 0.0165 | 52.0 | 4992 | 0.0449 | 0.8889 | 0.9320 | 0.9100 | 103 | 0.9064 | 0.9064 | 0.9064 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9221 | 0.9358 | 0.9289 | 0.9854 |
| 0.0156 | 53.0 | 5088 | 0.0445 | 0.8879 | 0.9223 | 0.9048 | 103 | 0.8960 | 0.9064 | 0.9012 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9175 | 0.9333 | 0.9253 | 0.9851 |
| 0.0159 | 54.0 | 5184 | 0.0444 | 0.9135 | 0.9223 | 0.9179 | 103 | 0.8977 | 0.9240 | 0.9107 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9248 | 0.9407 | 0.9327 | 0.9854 |
| 0.0171 | 55.0 | 5280 | 0.0511 | 0.8919 | 0.9612 | 0.9252 | 103 | 0.9207 | 0.8830 | 0.9015 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9287 | 0.9333 | 0.9310 | 0.9856 |
| 0.0174 | 56.0 | 5376 | 0.0471 | 0.9048 | 0.9223 | 0.9135 | 103 | 0.8864 | 0.9123 | 0.8991 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9177 | 0.9358 | 0.9267 | 0.9848 |
| 0.0157 | 57.0 | 5472 | 0.0476 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9017 | 0.9123 | 0.9070 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9205 | 0.9432 | 0.9317 | 0.9856 |
| 0.0162 | 58.0 | 5568 | 0.0458 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9064 | 0.9064 | 0.9064 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9223 | 0.9383 | 0.9302 | 0.9856 |
| 0.0141 | 59.0 | 5664 | 0.0440 | 0.9038 | 0.9126 | 0.9082 | 103 | 0.8971 | 0.9181 | 0.9075 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9221 | 0.9358 | 0.9289 | 0.9856 |
| 0.014 | 60.0 | 5760 | 0.0467 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9167 | 0.9006 | 0.9086 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9267 | 0.9358 | 0.9312 | 0.9856 |
| 0.014 | 61.0 | 5856 | 0.0497 | 0.8972 | 0.9320 | 0.9143 | 103 | 0.8895 | 0.8947 | 0.8921 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9173 | 0.9309 | 0.9240 | 0.9845 |
| 0.0139 | 62.0 | 5952 | 0.0457 | 0.8981 | 0.9417 | 0.9194 | 103 | 0.9012 | 0.9064 | 0.9038 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9223 | 0.9383 | 0.9302 | 0.9859 |
| 0.0142 | 63.0 | 6048 | 0.0467 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9222 | 0.9006 | 0.9112 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9267 | 0.9358 | 0.9312 | 0.9865 |
| 0.0137 | 64.0 | 6144 | 0.0475 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9162 | 0.8947 | 0.9053 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9242 | 0.9333 | 0.9287 | 0.9854 |
| 0.0129 | 65.0 | 6240 | 0.0478 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9222 | 0.9006 | 0.9112 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9289 | 0.9358 | 0.9323 | 0.9859 |
| 0.0119 | 66.0 | 6336 | 0.0467 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9006 | 0.9006 | 0.9006 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9199 | 0.9358 | 0.9278 | 0.9856 |
| 0.0132 | 67.0 | 6432 | 0.0486 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.8895 | 0.8947 | 0.8921 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9153 | 0.9333 | 0.9242 | 0.9845 |
| 0.0137 | 68.0 | 6528 | 0.0449 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9064 | 0.9064 | 0.9064 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9201 | 0.9383 | 0.9291 | 0.9854 |
| 0.0121 | 69.0 | 6624 | 0.0459 | 0.8981 | 0.9417 | 0.9194 | 103 | 0.8851 | 0.9006 | 0.8928 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9155 | 0.9358 | 0.9255 | 0.9848 |
| 0.012 | 70.0 | 6720 | 0.0467 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9107 | 0.8947 | 0.9027 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9242 | 0.9333 | 0.9287 | 0.9859 |
| 0.0116 | 71.0 | 6816 | 0.0449 | 0.9057 | 0.9320 | 0.9187 | 103 | 0.9023 | 0.9181 | 0.9101 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9248 | 0.9407 | 0.9327 | 0.9862 |
| 0.012 | 72.0 | 6912 | 0.0516 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9263 | 0.9309 | 0.9286 | 0.9851 |
| 0.0116 | 73.0 | 7008 | 0.0487 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.8953 | 0.9006 | 0.8980 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9177 | 0.9358 | 0.9267 | 0.9845 |
| 0.0118 | 74.0 | 7104 | 0.0488 | 0.8981 | 0.9417 | 0.9194 | 103 | 0.9012 | 0.9064 | 0.9038 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9223 | 0.9383 | 0.9302 | 0.9848 |
| 0.0126 | 75.0 | 7200 | 0.0507 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9217 | 0.8947 | 0.9080 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9265 | 0.9333 | 0.9299 | 0.9848 |
| 0.0111 | 76.0 | 7296 | 0.0514 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9112 | 0.9006 | 0.9059 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9244 | 0.9358 | 0.9301 | 0.9851 |
| 0.0107 | 77.0 | 7392 | 0.0541 | 0.8991 | 0.9515 | 0.9245 | 103 | 0.8922 | 0.8713 | 0.8817 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9191 | 0.9259 | 0.9225 | 0.9851 |
| 0.0109 | 78.0 | 7488 | 0.0491 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.8953 | 0.9006 | 0.8980 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9177 | 0.9358 | 0.9267 | 0.9845 |
| 0.0107 | 79.0 | 7584 | 0.0529 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9048 | 0.8889 | 0.8968 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9195 | 0.9309 | 0.9252 | 0.9845 |
| 0.0099 | 80.0 | 7680 | 0.0501 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.8895 | 0.8947 | 0.8921 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9153 | 0.9333 | 0.9242 | 0.9843 |
| 0.0109 | 81.0 | 7776 | 0.0484 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9006 | 0.9006 | 0.9006 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9199 | 0.9358 | 0.9278 | 0.9851 |
| 0.011 | 82.0 | 7872 | 0.0503 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9064 | 0.9064 | 0.9064 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9223 | 0.9383 | 0.9302 | 0.9851 |
| 0.0119 | 83.0 | 7968 | 0.0488 | 0.8972 | 0.9320 | 0.9143 | 103 | 0.9112 | 0.9006 | 0.9059 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9265 | 0.9333 | 0.9299 | 0.9859 |
| 0.0102 | 84.0 | 8064 | 0.0498 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9226 | 0.9064 | 0.9145 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9268 | 0.9383 | 0.9325 | 0.9865 |
| 0.012 | 85.0 | 8160 | 0.0485 | 0.8991 | 0.9515 | 0.9245 | 103 | 0.9172 | 0.9064 | 0.9118 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9293 | 0.9407 | 0.9350 | 0.9865 |
| 0.0105 | 86.0 | 8256 | 0.0489 | 0.8829 | 0.9515 | 0.9159 | 103 | 0.9157 | 0.8889 | 0.9021 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9242 | 0.9333 | 0.9287 | 0.9854 |
| 0.0113 | 87.0 | 8352 | 0.0481 | 0.8981 | 0.9417 | 0.9194 | 103 | 0.9226 | 0.9064 | 0.9145 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9314 | 0.9383 | 0.9348 | 0.9865 |
| 0.0106 | 88.0 | 8448 | 0.0477 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9172 | 0.9064 | 0.9118 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9270 | 0.9407 | 0.9338 | 0.9865 |
| 0.0108 | 89.0 | 8544 | 0.0473 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9118 | 0.9064 | 0.9091 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9248 | 0.9407 | 0.9327 | 0.9859 |
| 0.0107 | 90.0 | 8640 | 0.0482 | 0.8981 | 0.9417 | 0.9194 | 103 | 0.9118 | 0.9064 | 0.9091 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9268 | 0.9383 | 0.9325 | 0.9859 |
| 0.0097 | 91.0 | 8736 | 0.0480 | 0.8991 | 0.9515 | 0.9245 | 103 | 0.9006 | 0.9006 | 0.9006 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9223 | 0.9383 | 0.9302 | 0.9865 |
| 0.0104 | 92.0 | 8832 | 0.0477 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9176 | 0.9123 | 0.9150 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9270 | 0.9407 | 0.9338 | 0.9870 |
| 0.0101 | 93.0 | 8928 | 0.0479 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9176 | 0.9123 | 0.9150 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9270 | 0.9407 | 0.9338 | 0.9867 |
| 0.0099 | 94.0 | 9024 | 0.0488 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9107 | 0.8947 | 0.9027 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9220 | 0.9333 | 0.9276 | 0.9856 |
| 0.0098 | 95.0 | 9120 | 0.0473 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9118 | 0.9064 | 0.9091 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9246 | 0.9383 | 0.9314 | 0.9867 |
| 0.0085 | 96.0 | 9216 | 0.0475 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9118 | 0.9064 | 0.9091 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9246 | 0.9383 | 0.9314 | 0.9867 |
| 0.0096 | 97.0 | 9312 | 0.0476 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9006 | 0.9006 | 0.9006 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9199 | 0.9358 | 0.9278 | 0.9862 |
| 0.0097 | 98.0 | 9408 | 0.0477 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9006 | 0.9006 | 0.9006 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9199 | 0.9358 | 0.9278 | 0.9859 |
| 0.0087 | 99.0 | 9504 | 0.0478 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9006 | 0.9006 | 0.9006 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9199 | 0.9358 | 0.9278 | 0.9859 |
| 0.0085 | 100.0 | 9600 | 0.0476 | 0.8899 | 0.9417 | 0.9151 | 103 | 0.9006 | 0.9006 | 0.9006 | 171 | 0.9697 | 0.9771 | 0.9734 | 131 | 0.9199 | 0.9358 | 0.9278 | 0.9859 |

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1

Model tree for apwic/nerui-seq_bn-4

  • Base model: indolem/indobert-base-uncased