nerui-pt-pl20-2

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0659
  • Location Precision: 0.875
  • Location Recall: 0.9785
  • Location F1: 0.9239
  • Location Number: 93
  • Organization Precision: 0.9430
  • Organization Recall: 0.8976
  • Organization F1: 0.9198
  • Organization Number: 166
  • Person Precision: 0.9650
  • Person Recall: 0.9718
  • Person F1: 0.9684
  • Person Number: 142
  • Overall Precision: 0.9333
  • Overall Recall: 0.9426
  • Overall F1: 0.9380
  • Overall Accuracy: 0.9879
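
Each F1 above is the harmonic mean of the corresponding precision and recall. A minimal check in Python (the `f1` helper is illustrative, not part of the training code); because the listed precision and recall are themselves rounded, the result matches the reported overall F1 of 0.9380 only to about three decimal places:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Overall precision and recall from the evaluation results above.
print(round(f1(0.9333, 0.9426), 4))
```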

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
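
With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate presumably decays linearly from 5e-05 to zero over training. A minimal sketch of that schedule, assuming zero warmup and taking the total of 9600 optimizer steps from the training log below (96 steps per epoch for 100 epochs):

```python
def linear_lr(step, base_lr=5e-05, total_steps=9600):
    """Learning rate after `step` optimizer steps under linear decay to zero."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # full base rate at the start
print(linear_lr(4800))  # half the base rate at the midpoint
print(linear_lr(9600))  # zero at the end of training
```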

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8355 | 1.0 | 96 | 0.3860 | 0.125 | 0.0108 | 0.0198 | 93 | 0.2687 | 0.3253 | 0.2943 | 166 | 0.2477 | 0.3873 | 0.3022 | 142 | 0.2552 | 0.2743 | 0.2644 | 0.8705 |
| 0.3665 | 2.0 | 192 | 0.2359 | 0.2975 | 0.3871 | 0.3364 | 93 | 0.5309 | 0.5181 | 0.5244 | 166 | 0.4516 | 0.6901 | 0.5460 | 142 | 0.44 | 0.5486 | 0.4883 | 0.9243 |
| 0.201 | 3.0 | 288 | 0.0994 | 0.8523 | 0.8065 | 0.8287 | 93 | 0.6919 | 0.8253 | 0.7527 | 166 | 0.9589 | 0.9859 | 0.9722 | 142 | 0.8148 | 0.8778 | 0.8451 | 0.9701 |
| 0.1372 | 4.0 | 384 | 0.0864 | 0.7193 | 0.8817 | 0.7923 | 93 | 0.7737 | 0.8855 | 0.8258 | 166 | 0.9589 | 0.9859 | 0.9722 | 142 | 0.82 | 0.9202 | 0.8672 | 0.9734 |
| 0.1141 | 5.0 | 480 | 0.0641 | 0.8542 | 0.8817 | 0.8677 | 93 | 0.8333 | 0.8735 | 0.8529 | 166 | 0.9655 | 0.9859 | 0.9756 | 142 | 0.8843 | 0.9152 | 0.8995 | 0.9813 |
| 0.0964 | 6.0 | 576 | 0.0601 | 0.7521 | 0.9462 | 0.8381 | 93 | 0.8710 | 0.8133 | 0.8411 | 166 | 0.9790 | 0.9859 | 0.9825 | 142 | 0.8747 | 0.9052 | 0.8897 | 0.9811 |
| 0.0877 | 7.0 | 672 | 0.0540 | 0.83 | 0.8925 | 0.8601 | 93 | 0.8793 | 0.9217 | 0.9 | 166 | 0.9655 | 0.9859 | 0.9756 | 142 | 0.8974 | 0.9377 | 0.9171 | 0.9813 |
| 0.0764 | 8.0 | 768 | 0.0547 | 0.8214 | 0.9892 | 0.8976 | 93 | 0.9193 | 0.8916 | 0.9052 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9155 | 0.9451 | 0.9301 | 0.9827 |
| 0.0681 | 9.0 | 864 | 0.0442 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.8902 | 0.9277 | 0.9086 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9205 | 0.9526 | 0.9363 | 0.9846 |
| 0.066 | 10.0 | 960 | 0.0491 | 0.8505 | 0.9785 | 0.91 | 93 | 0.9487 | 0.8916 | 0.9193 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9855 |
| 0.0604 | 11.0 | 1056 | 0.0464 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9860 |
| 0.059 | 12.0 | 1152 | 0.0488 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.8889 | 0.9157 | 0.9021 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9218 | 0.9401 | 0.9309 | 0.9860 |
| 0.0545 | 13.0 | 1248 | 0.0419 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9874 |
| 0.0525 | 14.0 | 1344 | 0.0446 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9118 | 0.9337 | 0.9226 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9294 | 0.9526 | 0.9409 | 0.9871 |
| 0.0478 | 15.0 | 1440 | 0.0429 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9036 | 0.9036 | 0.9036 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9882 |
| 0.0466 | 16.0 | 1536 | 0.0460 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9863 |
| 0.0442 | 17.0 | 1632 | 0.0399 | 0.8585 | 0.9785 | 0.9146 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9879 |
| 0.042 | 18.0 | 1728 | 0.0402 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9371 | 0.8976 | 0.9169 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9882 |
| 0.0384 | 19.0 | 1824 | 0.0404 | 0.8585 | 0.9785 | 0.9146 | 93 | 0.9371 | 0.8976 | 0.9169 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9882 |
| 0.0365 | 20.0 | 1920 | 0.0457 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.8851 | 0.9277 | 0.9059 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9137 | 0.9501 | 0.9315 | 0.9860 |
| 0.0363 | 21.0 | 2016 | 0.0477 | 0.8273 | 0.9785 | 0.8966 | 93 | 0.9290 | 0.8675 | 0.8972 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9189 | 0.9327 | 0.9257 | 0.9855 |
| 0.0344 | 22.0 | 2112 | 0.0464 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.8988 | 0.9096 | 0.9042 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9155 | 0.9451 | 0.9301 | 0.9863 |
| 0.0313 | 23.0 | 2208 | 0.0449 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9379 | 0.9096 | 0.9235 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9383 | 0.9476 | 0.9429 | 0.9885 |
| 0.0311 | 24.0 | 2304 | 0.0442 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9273 | 0.9217 | 0.9245 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9879 |
| 0.031 | 25.0 | 2400 | 0.0506 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9494 | 0.9036 | 0.9259 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9860 |
| 0.0294 | 26.0 | 2496 | 0.0510 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9539 | 0.8735 | 0.9119 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9419 | 0.9302 | 0.9360 | 0.9874 |
| 0.0283 | 27.0 | 2592 | 0.0515 | 0.8318 | 0.9570 | 0.89 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9106 | 0.9401 | 0.9252 | 0.9852 |
| 0.0286 | 28.0 | 2688 | 0.0476 | 0.8571 | 0.9677 | 0.9091 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9857 |
| 0.0252 | 29.0 | 2784 | 0.0472 | 0.8476 | 0.9570 | 0.8990 | 93 | 0.9375 | 0.9036 | 0.9202 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9868 |
| 0.0262 | 30.0 | 2880 | 0.0478 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9201 | 0.9476 | 0.9337 | 0.9877 |
| 0.0252 | 31.0 | 2976 | 0.0459 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9893 |
| 0.0228 | 32.0 | 3072 | 0.0474 | 0.8585 | 0.9785 | 0.9146 | 93 | 0.9363 | 0.8855 | 0.9102 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9863 |
| 0.0226 | 33.0 | 3168 | 0.0494 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9879 |
| 0.0212 | 34.0 | 3264 | 0.0513 | 0.8364 | 0.9892 | 0.9064 | 93 | 0.9477 | 0.8735 | 0.9091 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9284 | 0.9377 | 0.9330 | 0.9877 |
| 0.0221 | 35.0 | 3360 | 0.0475 | 0.8667 | 0.9785 | 0.9192 | 93 | 0.9375 | 0.9036 | 0.9202 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9337 | 0.9476 | 0.9406 | 0.9877 |
| 0.0216 | 36.0 | 3456 | 0.0497 | 0.8426 | 0.9785 | 0.9055 | 93 | 0.9477 | 0.8735 | 0.9091 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9282 | 0.9352 | 0.9317 | 0.9866 |
| 0.0197 | 37.0 | 3552 | 0.0447 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9273 | 0.9217 | 0.9245 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9882 |
| 0.0201 | 38.0 | 3648 | 0.0532 | 0.8491 | 0.9677 | 0.9045 | 93 | 0.9241 | 0.8795 | 0.9012 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9863 |
| 0.0179 | 39.0 | 3744 | 0.0524 | 0.8598 | 0.9892 | 0.9200 | 93 | 0.9542 | 0.8795 | 0.9154 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9378 | 0.9401 | 0.9390 | 0.9879 |
| 0.0174 | 40.0 | 3840 | 0.0529 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9434 | 0.9036 | 0.9231 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9403 | 0.9426 | 0.9415 | 0.9882 |
| 0.0173 | 41.0 | 3936 | 0.0475 | 0.9263 | 0.9462 | 0.9362 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9404 | 0.9451 | 0.9428 | 0.9893 |
| 0.0169 | 42.0 | 4032 | 0.0544 | 0.8585 | 0.9785 | 0.9146 | 93 | 0.9363 | 0.8855 | 0.9102 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9238 | 0.9377 | 0.9307 | 0.9874 |
| 0.0166 | 43.0 | 4128 | 0.0543 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9359 | 0.8795 | 0.9068 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9280 | 0.9327 | 0.9303 | 0.9877 |
| 0.0173 | 44.0 | 4224 | 0.0546 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9877 |
| 0.0166 | 45.0 | 4320 | 0.0575 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9355 | 0.8735 | 0.9034 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9279 | 0.9302 | 0.9290 | 0.9868 |
| 0.0166 | 46.0 | 4416 | 0.0603 | 0.8571 | 0.9677 | 0.9091 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9863 |
| 0.0159 | 47.0 | 4512 | 0.0573 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.95 | 0.9157 | 0.9325 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9383 | 0.9476 | 0.9429 | 0.9885 |
| 0.0146 | 48.0 | 4608 | 0.0633 | 0.8667 | 0.9785 | 0.9192 | 93 | 0.9542 | 0.8795 | 0.9154 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9375 | 0.9352 | 0.9363 | 0.9877 |
| 0.0129 | 49.0 | 4704 | 0.0630 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9866 |
| 0.0141 | 50.0 | 4800 | 0.0630 | 0.8846 | 0.9892 | 0.9340 | 93 | 0.9371 | 0.8976 | 0.9169 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9871 |
| 0.0135 | 51.0 | 4896 | 0.0608 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9551 | 0.8976 | 0.9255 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9332 | 0.9401 | 0.9366 | 0.9866 |
| 0.0134 | 52.0 | 4992 | 0.0661 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9329 | 0.9217 | 0.9273 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9360 | 0.9476 | 0.9418 | 0.9871 |
| 0.0131 | 53.0 | 5088 | 0.0592 | 0.9278 | 0.9677 | 0.9474 | 93 | 0.9390 | 0.9277 | 0.9333 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9432 | 0.9526 | 0.9479 | 0.9890 |
| 0.0127 | 54.0 | 5184 | 0.0612 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9308 | 0.8916 | 0.9108 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9868 |
| 0.0136 | 55.0 | 5280 | 0.0612 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9494 | 0.9036 | 0.9259 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9877 |
| 0.0133 | 56.0 | 5376 | 0.0566 | 0.8585 | 0.9785 | 0.9146 | 93 | 0.9551 | 0.8976 | 0.9255 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9885 |
| 0.0131 | 57.0 | 5472 | 0.0581 | 0.8932 | 0.9892 | 0.9388 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9882 |
| 0.0124 | 58.0 | 5568 | 0.0581 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9874 |
| 0.0131 | 59.0 | 5664 | 0.0589 | 0.8846 | 0.9892 | 0.9340 | 93 | 0.9494 | 0.9036 | 0.9259 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9406 | 0.9476 | 0.9441 | 0.9882 |
| 0.0108 | 60.0 | 5760 | 0.0575 | 0.8932 | 0.9892 | 0.9388 | 93 | 0.9623 | 0.9217 | 0.9415 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9457 | 0.9551 | 0.9504 | 0.9898 |
| 0.0097 | 61.0 | 5856 | 0.0581 | 0.91 | 0.9785 | 0.9430 | 93 | 0.95 | 0.9157 | 0.9325 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9478 | 0.9501 | 0.9489 | 0.9893 |
| 0.0102 | 62.0 | 5952 | 0.0607 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9403 | 0.9426 | 0.9415 | 0.9879 |
| 0.0112 | 63.0 | 6048 | 0.0644 | 0.8585 | 0.9785 | 0.9146 | 93 | 0.9363 | 0.8855 | 0.9102 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9261 | 0.9377 | 0.9318 | 0.9857 |
| 0.009 | 64.0 | 6144 | 0.0635 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9379 | 0.9096 | 0.9235 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9885 |
| 0.0095 | 65.0 | 6240 | 0.0660 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9375 | 0.9036 | 0.9202 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9866 |
| 0.0092 | 66.0 | 6336 | 0.0665 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9871 |
| 0.0103 | 67.0 | 6432 | 0.0615 | 0.8571 | 0.9677 | 0.9091 | 93 | 0.9313 | 0.8976 | 0.9141 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9868 |
| 0.0086 | 68.0 | 6528 | 0.0615 | 0.8558 | 0.9570 | 0.9036 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9261 | 0.9377 | 0.9318 | 0.9877 |
| 0.0097 | 69.0 | 6624 | 0.0620 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9423 | 0.8855 | 0.9130 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9307 | 0.9377 | 0.9342 | 0.9879 |
| 0.0083 | 70.0 | 6720 | 0.0625 | 0.8667 | 0.9785 | 0.9192 | 93 | 0.9304 | 0.8855 | 0.9074 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9874 |
| 0.0079 | 71.0 | 6816 | 0.0575 | 0.9192 | 0.9785 | 0.9479 | 93 | 0.95 | 0.9157 | 0.9325 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9454 | 0.9501 | 0.9478 | 0.9896 |
| 0.0082 | 72.0 | 6912 | 0.0683 | 0.8585 | 0.9785 | 0.9146 | 93 | 0.9542 | 0.8795 | 0.9154 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9328 | 0.9352 | 0.9340 | 0.9874 |
| 0.0087 | 73.0 | 7008 | 0.0593 | 0.8762 | 0.9892 | 0.9293 | 93 | 0.9419 | 0.8795 | 0.9097 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9284 | 0.9377 | 0.9330 | 0.9882 |
| 0.007 | 74.0 | 7104 | 0.0607 | 0.8762 | 0.9892 | 0.9293 | 93 | 0.9490 | 0.8976 | 0.9226 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9877 |
| 0.0082 | 75.0 | 7200 | 0.0617 | 0.8932 | 0.9892 | 0.9388 | 93 | 0.9487 | 0.8916 | 0.9193 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9428 | 0.9451 | 0.9440 | 0.9893 |
| 0.0075 | 76.0 | 7296 | 0.0615 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9563 | 0.9217 | 0.9387 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9505 | 0.9576 | 0.9540 | 0.9896 |
| 0.0073 | 77.0 | 7392 | 0.0674 | 0.8679 | 0.9892 | 0.9246 | 93 | 0.9542 | 0.8795 | 0.9154 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9307 | 0.9377 | 0.9342 | 0.9882 |
| 0.0073 | 78.0 | 7488 | 0.0666 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9363 | 0.8855 | 0.9102 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9284 | 0.9377 | 0.9330 | 0.9868 |
| 0.0084 | 79.0 | 7584 | 0.0674 | 0.8667 | 0.9785 | 0.9192 | 93 | 0.9427 | 0.8916 | 0.9164 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9874 |
| 0.0078 | 80.0 | 7680 | 0.0658 | 0.8762 | 0.9892 | 0.9293 | 93 | 0.9487 | 0.8916 | 0.9193 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9885 |
| 0.0083 | 81.0 | 7776 | 0.0630 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9490 | 0.8976 | 0.9226 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9879 |
| 0.0069 | 82.0 | 7872 | 0.0661 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9481 | 0.8795 | 0.9125 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9305 | 0.9352 | 0.9328 | 0.9877 |
| 0.0068 | 83.0 | 7968 | 0.0675 | 0.8571 | 0.9677 | 0.9091 | 93 | 0.9487 | 0.8916 | 0.9193 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9284 | 0.9377 | 0.9330 | 0.9874 |
| 0.0062 | 84.0 | 8064 | 0.0648 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9613 | 0.8976 | 0.9283 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9401 | 0.9401 | 0.9401 | 0.9885 |
| 0.0066 | 85.0 | 8160 | 0.0631 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9451 | 0.9337 | 0.9394 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9457 | 0.9551 | 0.9504 | 0.9896 |
| 0.0061 | 86.0 | 8256 | 0.0701 | 0.8505 | 0.9785 | 0.91 | 93 | 0.9610 | 0.8916 | 0.9250 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9332 | 0.9401 | 0.9366 | 0.9871 |
| 0.0069 | 87.0 | 8352 | 0.0652 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.9375 | 0.9036 | 0.9202 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9879 |
| 0.0064 | 88.0 | 8448 | 0.0647 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9356 | 0.9426 | 0.9391 | 0.9879 |
| 0.0058 | 89.0 | 8544 | 0.0624 | 0.91 | 0.9785 | 0.9430 | 93 | 0.9437 | 0.9096 | 0.9264 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9429 | 0.9476 | 0.9453 | 0.9890 |
| 0.0058 | 90.0 | 8640 | 0.0643 | 0.8846 | 0.9892 | 0.9340 | 93 | 0.9490 | 0.8976 | 0.9226 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9882 |
| 0.0067 | 91.0 | 8736 | 0.0670 | 0.8762 | 0.9892 | 0.9293 | 93 | 0.9551 | 0.8976 | 0.9255 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9888 |
| 0.0058 | 92.0 | 8832 | 0.0675 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9877 |
| 0.0054 | 93.0 | 8928 | 0.0676 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9332 | 0.9401 | 0.9366 | 0.9877 |
| 0.0054 | 94.0 | 9024 | 0.0666 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9371 | 0.8976 | 0.9169 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9879 |
| 0.0062 | 95.0 | 9120 | 0.0664 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9434 | 0.9036 | 0.9231 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9879 |
| 0.0057 | 96.0 | 9216 | 0.0657 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9371 | 0.8976 | 0.9169 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9877 |
| 0.0059 | 97.0 | 9312 | 0.0659 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9375 | 0.9036 | 0.9202 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9879 |
| 0.0063 | 98.0 | 9408 | 0.0656 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9371 | 0.8976 | 0.9169 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9879 |
| 0.0056 | 99.0 | 9504 | 0.0658 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9879 |
| 0.0055 | 100.0 | 9600 | 0.0659 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9879 |
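
The overall precision and recall above are micro-averages over the three entity types. As a sanity check (a sketch only; true-positive and predicted-span counts are reconstructed by rounding from the final-epoch per-type metrics and supports, since only rounded values are reported):

```python
# Final-epoch per-type metrics: name -> (precision, recall, support)
types = {
    "Location": (0.875, 0.9785, 93),
    "Organization": (0.9430, 0.8976, 166),
    "Person": (0.9650, 0.9718, 142),
}

# Recover integer counts: TP = recall * support; predicted = TP / precision.
tp = sum(round(r * n) for p, r, n in types.values())
pred = sum(round(round(r * n) / p) for p, r, n in types.values())
gold = sum(n for _, _, n in types.values())

overall_precision = tp / pred
overall_recall = tp / gold
print(round(overall_precision, 4), round(overall_recall, 4))  # prints 0.9333 0.9426
```

The recovered counts (378 true positives out of 405 predicted and 401 gold spans) reproduce the reported overall precision of 0.9333 and recall of 0.9426.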

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2

Model tree for apwic/nerui-pt-pl20-2

This model was fine-tuned from indolem/indobert-base-uncased.