
nerui-seq_bn-2

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0508
  • Location Precision: 0.8835
  • Location Recall: 0.9785
  • Location F1: 0.9286
  • Location Number: 93
  • Organization Precision: 0.9268
  • Organization Recall: 0.9157
  • Organization F1: 0.9212
  • Organization Number: 166
  • Person Precision: 0.9789
  • Person Recall: 0.9789
  • Person F1: 0.9789
  • Person Number: 142
  • Overall Precision: 0.9340
  • Overall Recall: 0.9526
  • Overall F1: 0.9432
  • Overall Accuracy: 0.9877
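The overall scores above are micro-averages of the three per-entity scores. This can be checked with a short calculation (a sketch in pure Python; reconstructing integer TP/prediction counts by rounding is an assumption about how the reported figures were produced):

```python
# Reconstruct entity-level counts from the reported precision, recall,
# and support, then verify that the Overall scores are micro-averages.
entities = {
    # name: (precision, recall, support)
    "Location":     (0.8835, 0.9785, 93),
    "Organization": (0.9268, 0.9157, 166),
    "Person":       (0.9789, 0.9789, 142),
}

total_tp = total_pred = total_gold = 0
for precision, recall, support in entities.values():
    tp = round(recall * support)   # true-positive spans for this type
    pred = round(tp / precision)   # predicted spans of this type
    total_tp += tp
    total_pred += pred
    total_gold += support

micro_p = total_tp / total_pred
micro_r = total_tp / total_gold
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)

print(round(micro_p, 4), round(micro_r, 4), round(micro_f1, 4))
# → 0.934 0.9526 0.9432 (matching the Overall scores above)
```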

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
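The results table below logs 96 optimizer steps per epoch; combined with the batch size this implies roughly 1,536 training examples (an inference from the step counts, not stated in the card) and 9,600 total steps over 100 epochs:

```python
# Sanity-check the training schedule implied by the hyperparameters
# and the step counts in the results table.
train_batch_size = 16
num_epochs = 100
steps_per_epoch = 96  # from the table: step 96 at epoch 1.0

total_steps = steps_per_epoch * num_epochs
# Upper bound on training-set size; the last batch per epoch may be partial.
approx_train_examples = steps_per_epoch * train_batch_size

print(total_steps)            # → 9600, the step count in the final row
print(approx_train_examples)  # → 1536
```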

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8344 | 1.0 | 96 | 0.5347 | 0.0 | 0.0 | 0.0 | 93 | 0.1667 | 0.0060 | 0.0116 | 166 | 0.0 | 0.0 | 0.0 | 142 | 0.1111 | 0.0025 | 0.0049 | 0.8348 |
| 0.4476 | 2.0 | 192 | 0.3089 | 0.3061 | 0.1613 | 0.2113 | 93 | 0.3575 | 0.4759 | 0.4083 | 166 | 0.3249 | 0.5423 | 0.4063 | 142 | 0.3373 | 0.4264 | 0.3767 | 0.9001 |
| 0.2971 | 3.0 | 288 | 0.2109 | 0.3962 | 0.4516 | 0.4221 | 93 | 0.5707 | 0.7048 | 0.6307 | 166 | 0.6630 | 0.8451 | 0.7430 | 142 | 0.5671 | 0.6958 | 0.6249 | 0.9438 |
| 0.21 | 4.0 | 384 | 0.1358 | 0.5741 | 0.6667 | 0.6169 | 93 | 0.6935 | 0.7771 | 0.7330 | 166 | 0.8701 | 0.9437 | 0.9054 | 142 | 0.7254 | 0.8105 | 0.7656 | 0.9627 |
| 0.1444 | 5.0 | 480 | 0.0974 | 0.7064 | 0.8280 | 0.7624 | 93 | 0.7545 | 0.7590 | 0.7568 | 166 | 0.9054 | 0.9437 | 0.9241 | 142 | 0.7948 | 0.8404 | 0.8170 | 0.9701 |
| 0.1206 | 6.0 | 576 | 0.0884 | 0.7456 | 0.9140 | 0.8213 | 93 | 0.7989 | 0.8373 | 0.8176 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8337 | 0.9002 | 0.8657 | 0.9739 |
| 0.1044 | 7.0 | 672 | 0.0804 | 0.7748 | 0.9247 | 0.8431 | 93 | 0.8056 | 0.8735 | 0.8382 | 166 | 0.9388 | 0.9718 | 0.9550 | 142 | 0.8425 | 0.9202 | 0.8796 | 0.9761 |
| 0.0937 | 8.0 | 768 | 0.0711 | 0.7727 | 0.9140 | 0.8374 | 93 | 0.8352 | 0.8855 | 0.8596 | 166 | 0.9384 | 0.9648 | 0.9514 | 142 | 0.8542 | 0.9202 | 0.8860 | 0.9786 |
| 0.0906 | 9.0 | 864 | 0.0635 | 0.8365 | 0.9355 | 0.8832 | 93 | 0.8295 | 0.8795 | 0.8538 | 166 | 0.9388 | 0.9718 | 0.9550 | 142 | 0.8689 | 0.9252 | 0.8961 | 0.9805 |
| 0.086 | 10.0 | 960 | 0.0629 | 0.8190 | 0.9247 | 0.8687 | 93 | 0.8232 | 0.8976 | 0.8588 | 166 | 0.9257 | 0.9648 | 0.9448 | 142 | 0.8571 | 0.9277 | 0.8910 | 0.9802 |
| 0.0759 | 11.0 | 1056 | 0.0562 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.8278 | 0.8976 | 0.8613 | 166 | 0.9320 | 0.9648 | 0.9481 | 142 | 0.8797 | 0.9302 | 0.9042 | 0.9808 |
| 0.071 | 12.0 | 1152 | 0.0537 | 0.8673 | 0.9140 | 0.8901 | 93 | 0.8361 | 0.9217 | 0.8768 | 166 | 0.9384 | 0.9648 | 0.9514 | 142 | 0.8782 | 0.9352 | 0.9058 | 0.9819 |
| 0.0669 | 13.0 | 1248 | 0.0504 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8765 | 0.8976 | 0.8869 | 166 | 0.9384 | 0.9648 | 0.9514 | 142 | 0.8990 | 0.9327 | 0.9155 | 0.9841 |
| 0.0639 | 14.0 | 1344 | 0.0516 | 0.8462 | 0.9462 | 0.8934 | 93 | 0.8810 | 0.8916 | 0.8862 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.8969 | 0.9327 | 0.9144 | 0.9833 |
| 0.061 | 15.0 | 1440 | 0.0463 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8678 | 0.9096 | 0.8882 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9017 | 0.9377 | 0.9193 | 0.9855 |
| 0.0594 | 16.0 | 1536 | 0.0477 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.8678 | 0.9096 | 0.8882 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8998 | 0.9401 | 0.9195 | 0.9852 |
| 0.0567 | 17.0 | 1632 | 0.0467 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.8571 | 0.9036 | 0.8798 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8952 | 0.9377 | 0.9160 | 0.9844 |
| 0.0526 | 18.0 | 1728 | 0.0412 | 0.9263 | 0.9462 | 0.9362 | 93 | 0.8786 | 0.9157 | 0.8968 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9197 | 0.9426 | 0.9310 | 0.9871 |
| 0.05 | 19.0 | 1824 | 0.0427 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9197 | 0.9426 | 0.9310 | 0.9868 |
| 0.0474 | 20.0 | 1920 | 0.0438 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.8701 | 0.9277 | 0.8980 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9028 | 0.9501 | 0.9259 | 0.9855 |
| 0.0472 | 21.0 | 2016 | 0.0415 | 0.9 | 0.9677 | 0.9326 | 93 | 0.8844 | 0.9217 | 0.9027 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9159 | 0.9501 | 0.9327 | 0.9874 |
| 0.0426 | 22.0 | 2112 | 0.0416 | 0.89 | 0.9570 | 0.9223 | 93 | 0.8686 | 0.9157 | 0.8915 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9067 | 0.9451 | 0.9255 | 0.9868 |
| 0.0422 | 23.0 | 2208 | 0.0421 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9036 | 0.9036 | 0.9036 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9868 |
| 0.0418 | 24.0 | 2304 | 0.0450 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9199 | 0.9451 | 0.9323 | 0.9849 |
| 0.038 | 25.0 | 2400 | 0.0422 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.8786 | 0.9157 | 0.8968 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9089 | 0.9451 | 0.9267 | 0.9852 |
| 0.037 | 26.0 | 2496 | 0.0401 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.8686 | 0.9157 | 0.8915 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9045 | 0.9451 | 0.9244 | 0.9857 |
| 0.0346 | 27.0 | 2592 | 0.0395 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9153 | 0.9426 | 0.9287 | 0.9868 |
| 0.0363 | 28.0 | 2688 | 0.0427 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.8539 | 0.9157 | 0.8837 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9021 | 0.9426 | 0.9220 | 0.9846 |
| 0.0366 | 29.0 | 2784 | 0.0419 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9223 | 0.9476 | 0.9348 | 0.9874 |
| 0.0323 | 30.0 | 2880 | 0.0411 | 0.89 | 0.9570 | 0.9223 | 93 | 0.8953 | 0.9277 | 0.9112 | 166 | 0.9722 | 0.9859 | 0.9790 | 142 | 0.9207 | 0.9551 | 0.9376 | 0.9857 |
| 0.0325 | 31.0 | 2976 | 0.0397 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9155 | 0.9451 | 0.9301 | 0.9871 |
| 0.0295 | 32.0 | 3072 | 0.0414 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.8779 | 0.9096 | 0.8935 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9108 | 0.9426 | 0.9265 | 0.9857 |
| 0.0288 | 33.0 | 3168 | 0.0425 | 0.8641 | 0.9570 | 0.9082 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9104 | 0.9377 | 0.9238 | 0.9844 |
| 0.0288 | 34.0 | 3264 | 0.0404 | 0.89 | 0.9570 | 0.9223 | 93 | 0.8793 | 0.9217 | 0.9 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9159 | 0.9501 | 0.9327 | 0.9857 |
| 0.0264 | 35.0 | 3360 | 0.0397 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.8941 | 0.9157 | 0.9048 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9868 |
| 0.028 | 36.0 | 3456 | 0.0431 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9860 |
| 0.0245 | 37.0 | 3552 | 0.0393 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.8895 | 0.9217 | 0.9053 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9177 | 0.9451 | 0.9312 | 0.9868 |
| 0.0238 | 38.0 | 3648 | 0.0424 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9270 | 0.9501 | 0.9384 | 0.9866 |
| 0.0238 | 39.0 | 3744 | 0.0411 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9871 |
| 0.0233 | 40.0 | 3840 | 0.0407 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9 | 0.9217 | 0.9107 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9223 | 0.9476 | 0.9348 | 0.9863 |
| 0.023 | 41.0 | 3936 | 0.0403 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.8736 | 0.9157 | 0.8941 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9133 | 0.9451 | 0.9289 | 0.9863 |
| 0.0217 | 42.0 | 4032 | 0.0416 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9036 | 0.9036 | 0.9036 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9218 | 0.9401 | 0.9309 | 0.9863 |
| 0.0213 | 43.0 | 4128 | 0.0427 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9284 | 0.9377 | 0.9330 | 0.9855 |
| 0.0216 | 44.0 | 4224 | 0.0420 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9291 | 0.9476 | 0.9383 | 0.9871 |
| 0.02 | 45.0 | 4320 | 0.0422 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9314 | 0.9476 | 0.9394 | 0.9863 |
| 0.0187 | 46.0 | 4416 | 0.0419 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.8982 | 0.9036 | 0.9009 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9874 |
| 0.0195 | 47.0 | 4512 | 0.0416 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8837 | 0.9157 | 0.8994 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9153 | 0.9426 | 0.9287 | 0.9868 |
| 0.0183 | 48.0 | 4608 | 0.0421 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9268 | 0.9476 | 0.9371 | 0.9871 |
| 0.0182 | 49.0 | 4704 | 0.0412 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9175 | 0.9426 | 0.9299 | 0.9874 |
| 0.0154 | 50.0 | 4800 | 0.0424 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9879 |
| 0.017 | 51.0 | 4896 | 0.0436 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9868 |
| 0.0165 | 52.0 | 4992 | 0.0437 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9249 | 0.9526 | 0.9386 | 0.9874 |
| 0.0161 | 53.0 | 5088 | 0.0460 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9314 | 0.9476 | 0.9394 | 0.9874 |
| 0.0149 | 54.0 | 5184 | 0.0453 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9296 | 0.9551 | 0.9422 | 0.9877 |
| 0.0149 | 55.0 | 5280 | 0.0462 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9860 |
| 0.0155 | 56.0 | 5376 | 0.0445 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9248 | 0.9501 | 0.9373 | 0.9877 |
| 0.0151 | 57.0 | 5472 | 0.0473 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.8982 | 0.9036 | 0.9009 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9246 | 0.9476 | 0.9360 | 0.9868 |
| 0.0164 | 58.0 | 5568 | 0.0492 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9272 | 0.9526 | 0.9397 | 0.9871 |
| 0.0144 | 59.0 | 5664 | 0.0464 | 0.9109 | 0.9892 | 0.9485 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9387 | 0.9551 | 0.9468 | 0.9877 |
| 0.014 | 60.0 | 5760 | 0.0499 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9367 | 0.8916 | 0.9136 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9356 | 0.9426 | 0.9391 | 0.9863 |
| 0.0137 | 61.0 | 5856 | 0.0452 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9874 |
| 0.0132 | 62.0 | 5952 | 0.0468 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9363 | 0.9526 | 0.9444 | 0.9874 |
| 0.0125 | 63.0 | 6048 | 0.0456 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9249 | 0.9526 | 0.9386 | 0.9877 |
| 0.0131 | 64.0 | 6144 | 0.0464 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9407 | 0.9501 | 0.9454 | 0.9871 |
| 0.0119 | 65.0 | 6240 | 0.0464 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9874 |
| 0.0135 | 66.0 | 6336 | 0.0454 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9337 | 0.9476 | 0.9406 | 0.9877 |
| 0.0115 | 67.0 | 6432 | 0.0471 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9874 |
| 0.0124 | 68.0 | 6528 | 0.0468 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9871 |
| 0.0124 | 69.0 | 6624 | 0.0464 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9251 | 0.9551 | 0.9399 | 0.9877 |
| 0.0107 | 70.0 | 6720 | 0.0470 | 0.8932 | 0.9892 | 0.9388 | 93 | 0.9325 | 0.9157 | 0.9240 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9364 | 0.9551 | 0.9457 | 0.9879 |
| 0.011 | 71.0 | 6816 | 0.0476 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.9226 | 0.9337 | 0.9281 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9322 | 0.9601 | 0.9459 | 0.9885 |
| 0.0102 | 72.0 | 6912 | 0.0485 | 0.8932 | 0.9892 | 0.9388 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9363 | 0.9526 | 0.9444 | 0.9877 |
| 0.0096 | 73.0 | 7008 | 0.0485 | 0.8932 | 0.9892 | 0.9388 | 93 | 0.9383 | 0.9157 | 0.9268 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9410 | 0.9551 | 0.9480 | 0.9877 |
| 0.0107 | 74.0 | 7104 | 0.0472 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9379 | 0.9096 | 0.9235 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9874 |
| 0.0119 | 75.0 | 7200 | 0.0504 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9871 |
| 0.0105 | 76.0 | 7296 | 0.0488 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9874 |
| 0.0102 | 77.0 | 7392 | 0.0504 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9273 | 0.9217 | 0.9245 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9341 | 0.9551 | 0.9445 | 0.9877 |
| 0.0098 | 78.0 | 7488 | 0.0484 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9409 | 0.9526 | 0.9467 | 0.9885 |
| 0.0089 | 79.0 | 7584 | 0.0468 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9882 |
| 0.0113 | 80.0 | 7680 | 0.0493 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9291 | 0.9476 | 0.9383 | 0.9877 |
| 0.0093 | 81.0 | 7776 | 0.0489 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9879 |
| 0.0094 | 82.0 | 7872 | 0.0519 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9866 |
| 0.0104 | 83.0 | 7968 | 0.0508 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9379 | 0.9096 | 0.9235 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9360 | 0.9476 | 0.9418 | 0.9871 |
| 0.0098 | 84.0 | 8064 | 0.0511 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9337 | 0.9476 | 0.9406 | 0.9877 |
| 0.0092 | 85.0 | 8160 | 0.0493 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9325 | 0.9157 | 0.9240 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9410 | 0.9551 | 0.9480 | 0.9888 |
| 0.0099 | 86.0 | 8256 | 0.0502 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9383 | 0.9157 | 0.9268 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9879 |
| 0.0092 | 87.0 | 8352 | 0.0496 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9337 | 0.9476 | 0.9406 | 0.9874 |
| 0.0091 | 88.0 | 8448 | 0.0490 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9361 | 0.9501 | 0.9431 | 0.9877 |
| 0.0091 | 89.0 | 8544 | 0.0497 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9874 |
| 0.0093 | 90.0 | 8640 | 0.0487 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9386 | 0.9526 | 0.9455 | 0.9879 |
| 0.0096 | 91.0 | 8736 | 0.0500 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9879 |
| 0.0088 | 92.0 | 8832 | 0.0506 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9874 |
| 0.0088 | 93.0 | 8928 | 0.0510 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9874 |
| 0.0089 | 94.0 | 9024 | 0.0513 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9871 |
| 0.0087 | 95.0 | 9120 | 0.0509 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9874 |
| 0.0082 | 96.0 | 9216 | 0.0506 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9874 |
| 0.0093 | 97.0 | 9312 | 0.0512 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9877 |
| 0.0083 | 98.0 | 9408 | 0.0509 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9877 |
| 0.0075 | 99.0 | 9504 | 0.0507 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9877 |
| 0.0094 | 100.0 | 9600 | 0.0508 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9877 |
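The per-entity columns above are span-level scores in the seqeval style: a predicted entity counts as correct only if both its type and its boundaries match a gold entity exactly. A minimal sketch of BIO span extraction and scoring (a simplified stand-in for seqeval, not the exact implementation used for this card; a mismatched `I-` tag is treated as a boundary and its span dropped):

```python
def extract_spans(tags):
    """Collect (type, start, end) spans from a BIO tag sequence."""
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel "O" flushes the last span
        if tag.startswith("B-") or tag == "O" or (
            tag.startswith("I-") and tag[2:] != etype
        ):
            if start is not None:
                spans.append((etype, start, i))
            start, etype = (i, tag[2:]) if tag.startswith("B-") else (None, None)
    return spans

# Toy example: the LOC span has different boundaries in gold vs prediction.
gold = ["B-PER", "I-PER", "O", "B-LOC", "O"]
pred = ["B-PER", "I-PER", "O", "B-LOC", "I-LOC"]

gold_spans = set(extract_spans(gold))
pred_spans = set(extract_spans(pred))
tp = len(gold_spans & pred_spans)
precision = tp / len(pred_spans)  # only the PER span matches exactly
recall = tp / len(gold_spans)
print(precision, recall)          # → 0.5 0.5
```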

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1

Model tree for apwic/nerui-seq_bn-2

Fine-tuned from indolem/indobert-base-uncased.