nerui-pt-pl5-3

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0592
  • Location Precision: 0.8696
  • Location Recall: 0.9302
  • Location F1: 0.8989
  • Location Number: 86
  • Organization Precision: 0.9545
  • Organization Recall: 0.9438
  • Organization F1: 0.9492
  • Organization Number: 178
  • Person Precision: 0.9764
  • Person Recall: 0.9688
  • Person F1: 0.9725
  • Person Number: 128
  • Overall Precision: 0.9418
  • Overall Recall: 0.9490
  • Overall F1: 0.9454
  • Overall Accuracy: 0.9881
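As a quick sanity check, each per-entity F1 above is the harmonic mean of the reported precision and recall; recomputing it from the rounded published values reproduces the F1 scores to within rounding error:

```python
# F1 is the harmonic mean of precision and recall: 2*P*R / (P + R).
def f1(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

# Recomputed from the rounded precision/recall values reported above.
loc_f1 = f1(0.8696, 0.9302)   # reported Location F1: 0.8989
org_f1 = f1(0.9545, 0.9438)   # reported Organization F1: 0.9492
per_f1 = f1(0.9764, 0.9688)   # reported Person F1: 0.9725
```

Small last-digit differences (e.g. 0.9491 vs. 0.9492) come from the precision/recall values themselves being rounded to four decimals.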

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
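The training script itself is not published; a plausible `TrainingArguments` configuration matching the listed hyperparameters would look like the sketch below. The `output_dir` name is an assumption, and the Adam betas/epsilon listed above are the Transformers defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

# Sketch only: reconstructs the listed hyperparameters, not the actual
# (unpublished) training script for this run.
args = TrainingArguments(
    output_dir="nerui-pt-pl5-3",      # assumed name
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
)
```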

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8675 | 1.0 | 96 | 0.3662 | 0.25 | 0.0116 | 0.0222 | 86 | 0.2826 | 0.3652 | 0.3186 | 178 | 0.3011 | 0.4141 | 0.3487 | 128 | 0.2902 | 0.3036 | 0.2968 | 0.8794 |
| 0.3489 | 2.0 | 192 | 0.2010 | 0.4141 | 0.4767 | 0.4432 | 86 | 0.6627 | 0.6292 | 0.6455 | 178 | 0.76 | 0.8906 | 0.8201 | 128 | 0.6388 | 0.6811 | 0.6593 | 0.9428 |
| 0.2001 | 3.0 | 288 | 0.1061 | 0.75 | 0.8023 | 0.7753 | 86 | 0.7577 | 0.8258 | 0.7903 | 178 | 0.9618 | 0.9844 | 0.9730 | 128 | 0.8201 | 0.8724 | 0.8455 | 0.9690 |
| 0.1393 | 4.0 | 384 | 0.0960 | 0.7075 | 0.8721 | 0.7812 | 86 | 0.7821 | 0.7865 | 0.7843 | 178 | 0.9470 | 0.9766 | 0.9615 | 128 | 0.8153 | 0.8673 | 0.8405 | 0.9717 |
| 0.1152 | 5.0 | 480 | 0.0724 | 0.6893 | 0.8256 | 0.7513 | 86 | 0.8475 | 0.8427 | 0.8451 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.8501 | 0.8827 | 0.8661 | 0.9752 |
| 0.0966 | 6.0 | 576 | 0.0693 | 0.7232 | 0.9419 | 0.8182 | 86 | 0.8848 | 0.8202 | 0.8513 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.8691 | 0.8980 | 0.8833 | 0.9773 |
| 0.0885 | 7.0 | 672 | 0.0607 | 0.6697 | 0.8488 | 0.7487 | 86 | 0.8857 | 0.8708 | 0.8782 | 178 | 0.9612 | 0.9688 | 0.9650 | 128 | 0.8523 | 0.8980 | 0.8745 | 0.9773 |
| 0.0843 | 8.0 | 768 | 0.0572 | 0.8065 | 0.8721 | 0.8380 | 86 | 0.8811 | 0.9157 | 0.8981 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.8916 | 0.9235 | 0.9073 | 0.9814 |
| 0.0746 | 9.0 | 864 | 0.0530 | 0.7476 | 0.8953 | 0.8148 | 86 | 0.9226 | 0.8708 | 0.8960 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.8947 | 0.9107 | 0.9027 | 0.9811 |
| 0.068 | 10.0 | 960 | 0.0578 | 0.7105 | 0.9419 | 0.8100 | 86 | 0.8876 | 0.8427 | 0.8646 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.8662 | 0.9082 | 0.8867 | 0.9784 |
| 0.0648 | 11.0 | 1056 | 0.0619 | 0.7745 | 0.9186 | 0.8404 | 86 | 0.9136 | 0.8315 | 0.8706 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.8957 | 0.8980 | 0.8968 | 0.9792 |
| 0.0601 | 12.0 | 1152 | 0.0443 | 0.8242 | 0.8721 | 0.8475 | 86 | 0.9048 | 0.9607 | 0.9319 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9071 | 0.9464 | 0.9263 | 0.9849 |
| 0.052 | 13.0 | 1248 | 0.0547 | 0.8421 | 0.9302 | 0.8840 | 86 | 0.9401 | 0.8820 | 0.9101 | 178 | 0.9615 | 0.9766 | 0.9690 | 128 | 0.9235 | 0.9235 | 0.9235 | 0.9814 |
| 0.0514 | 14.0 | 1344 | 0.0455 | 0.8387 | 0.9070 | 0.8715 | 86 | 0.9360 | 0.9045 | 0.9200 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9262 | 0.9286 | 0.9274 | 0.9854 |
| 0.054 | 15.0 | 1440 | 0.0495 | 0.8667 | 0.9070 | 0.8864 | 86 | 0.8889 | 0.8989 | 0.8939 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9095 | 0.9235 | 0.9165 | 0.9849 |
| 0.0477 | 16.0 | 1536 | 0.0453 | 0.8404 | 0.9186 | 0.8778 | 86 | 0.9045 | 0.9045 | 0.9045 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9148 | 0.9311 | 0.9229 | 0.9860 |
| 0.0476 | 17.0 | 1632 | 0.0503 | 0.8602 | 0.9302 | 0.8939 | 86 | 0.9213 | 0.9213 | 0.9213 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9271 | 0.9413 | 0.9342 | 0.9860 |
| 0.0447 | 18.0 | 1728 | 0.0499 | 0.7822 | 0.9186 | 0.8449 | 86 | 0.9048 | 0.8539 | 0.8786 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.8967 | 0.9082 | 0.9024 | 0.9830 |
| 0.0463 | 19.0 | 1824 | 0.0458 | 0.8778 | 0.9186 | 0.8977 | 86 | 0.9375 | 0.9270 | 0.9322 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9389 | 0.9413 | 0.9401 | 0.9876 |
| 0.0404 | 20.0 | 1920 | 0.0572 | 0.7434 | 0.9767 | 0.8442 | 86 | 0.9226 | 0.8708 | 0.8960 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.8900 | 0.9286 | 0.9089 | 0.9827 |
| 0.0389 | 21.0 | 2016 | 0.0421 | 0.8333 | 0.9302 | 0.8791 | 86 | 0.92 | 0.9045 | 0.9122 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9173 | 0.9337 | 0.9254 | 0.9876 |
| 0.0386 | 22.0 | 2112 | 0.0478 | 0.8137 | 0.9651 | 0.8830 | 86 | 0.9249 | 0.8989 | 0.9117 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9129 | 0.9362 | 0.9244 | 0.9849 |
| 0.0351 | 23.0 | 2208 | 0.0482 | 0.7980 | 0.9186 | 0.8541 | 86 | 0.9235 | 0.8820 | 0.9023 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9116 | 0.9209 | 0.9162 | 0.9857 |
| 0.0362 | 24.0 | 2304 | 0.0478 | 0.8351 | 0.9419 | 0.8852 | 86 | 0.9050 | 0.9101 | 0.9076 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9132 | 0.9388 | 0.9258 | 0.9860 |
| 0.0325 | 25.0 | 2400 | 0.0490 | 0.8163 | 0.9302 | 0.8696 | 86 | 0.9070 | 0.8764 | 0.8914 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9048 | 0.9209 | 0.9128 | 0.9846 |
| 0.0308 | 26.0 | 2496 | 0.0417 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9157 | 0.9157 | 0.9157 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9248 | 0.9413 | 0.9330 | 0.9879 |
| 0.0335 | 27.0 | 2592 | 0.0569 | 0.7961 | 0.9535 | 0.8677 | 86 | 0.9321 | 0.8483 | 0.8882 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9109 | 0.9133 | 0.9121 | 0.9819 |
| 0.0319 | 28.0 | 2688 | 0.0401 | 0.9412 | 0.9302 | 0.9357 | 86 | 0.9454 | 0.9719 | 0.9584 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9570 | 0.9643 | 0.9606 | 0.9903 |
| 0.0288 | 29.0 | 2784 | 0.0400 | 0.8495 | 0.9186 | 0.8827 | 86 | 0.9494 | 0.9494 | 0.9494 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9372 | 0.9515 | 0.9443 | 0.9897 |
| 0.0303 | 30.0 | 2880 | 0.0523 | 0.8542 | 0.9535 | 0.9011 | 86 | 0.9419 | 0.9101 | 0.9257 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9854 |
| 0.0283 | 31.0 | 2976 | 0.0433 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9591 | 0.9213 | 0.9398 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9437 | 0.9413 | 0.9425 | 0.9876 |
| 0.0287 | 32.0 | 3072 | 0.0520 | 0.81 | 0.9419 | 0.8710 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9111 | 0.9413 | 0.9260 | 0.9846 |
| 0.0265 | 33.0 | 3168 | 0.0496 | 0.8384 | 0.9651 | 0.8973 | 86 | 0.9477 | 0.9157 | 0.9314 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9296 | 0.9439 | 0.9367 | 0.9862 |
| 0.0274 | 34.0 | 3264 | 0.0448 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9330 | 0.9382 | 0.9356 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9328 | 0.9566 | 0.9446 | 0.9876 |
| 0.0286 | 35.0 | 3360 | 0.0495 | 0.8542 | 0.9535 | 0.9011 | 86 | 0.9306 | 0.9045 | 0.9174 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9270 | 0.9388 | 0.9328 | 0.9860 |
| 0.026 | 36.0 | 3456 | 0.0493 | 0.82 | 0.9535 | 0.8817 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9202 | 0.9413 | 0.9306 | 0.9865 |
| 0.0251 | 37.0 | 3552 | 0.0467 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9419 | 0.9101 | 0.9257 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9359 | 0.9311 | 0.9335 | 0.9884 |
| 0.0251 | 38.0 | 3648 | 0.0442 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9439 | 0.9439 | 0.9439 | 0.9887 |
| 0.0232 | 39.0 | 3744 | 0.0503 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9318 | 0.9213 | 0.9266 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9369 | 0.9464 | 0.9416 | 0.9876 |
| 0.0235 | 40.0 | 3840 | 0.0449 | 0.9101 | 0.9419 | 0.9257 | 86 | 0.9185 | 0.9494 | 0.9337 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9352 | 0.9566 | 0.9458 | 0.9887 |
| 0.0216 | 41.0 | 3936 | 0.0555 | 0.8864 | 0.9070 | 0.8966 | 86 | 0.8950 | 0.9101 | 0.9025 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9192 | 0.9286 | 0.9239 | 0.9846 |
| 0.0213 | 42.0 | 4032 | 0.0546 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9195 | 0.8989 | 0.9091 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9264 | 0.9311 | 0.9288 | 0.9852 |
| 0.0207 | 43.0 | 4128 | 0.0578 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9281 | 0.8708 | 0.8986 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9328 | 0.9209 | 0.9268 | 0.9860 |
| 0.0222 | 44.0 | 4224 | 0.0495 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9302 | 0.8989 | 0.9143 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9308 | 0.9260 | 0.9284 | 0.9876 |
| 0.0198 | 45.0 | 4320 | 0.0548 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9306 | 0.9045 | 0.9174 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9293 | 0.9388 | 0.9340 | 0.9873 |
| 0.0206 | 46.0 | 4416 | 0.0562 | 0.8617 | 0.9419 | 0.9000 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9862 |
| 0.02 | 47.0 | 4512 | 0.0558 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9322 | 0.9270 | 0.9296 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9320 | 0.9439 | 0.9379 | 0.9860 |
| 0.0193 | 48.0 | 4608 | 0.0496 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9535 | 0.9213 | 0.9371 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9413 | 0.9413 | 0.9413 | 0.9887 |
| 0.0197 | 49.0 | 4704 | 0.0511 | 0.8617 | 0.9419 | 0.9000 | 86 | 0.9586 | 0.9101 | 0.9337 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9410 | 0.9362 | 0.9386 | 0.9879 |
| 0.0191 | 50.0 | 4800 | 0.0537 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9429 | 0.9270 | 0.9348 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9367 | 0.9439 | 0.9403 | 0.9879 |
| 0.0184 | 51.0 | 4896 | 0.0520 | 0.875 | 0.9767 | 0.9231 | 86 | 0.9474 | 0.9101 | 0.9284 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9392 | 0.9464 | 0.9428 | 0.9881 |
| 0.0175 | 52.0 | 4992 | 0.0613 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9375 | 0.9270 | 0.9322 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9345 | 0.9464 | 0.9404 | 0.9862 |
| 0.017 | 53.0 | 5088 | 0.0533 | 0.8617 | 0.9419 | 0.9000 | 86 | 0.9261 | 0.9157 | 0.9209 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9270 | 0.9388 | 0.9328 | 0.9876 |
| 0.0184 | 54.0 | 5184 | 0.0515 | 0.8764 | 0.9070 | 0.8914 | 86 | 0.9330 | 0.9382 | 0.9356 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9365 | 0.9413 | 0.9389 | 0.9879 |
| 0.0157 | 55.0 | 5280 | 0.0471 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9379 | 0.9326 | 0.9352 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9367 | 0.9439 | 0.9403 | 0.9889 |
| 0.0183 | 56.0 | 5376 | 0.0491 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9364 | 0.9388 | 0.9376 | 0.9881 |
| 0.0147 | 57.0 | 5472 | 0.0583 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9171 | 0.9326 | 0.9248 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9277 | 0.9490 | 0.9382 | 0.9865 |
| 0.0157 | 58.0 | 5568 | 0.0594 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9367 | 0.9439 | 0.9403 | 0.9868 |
| 0.016 | 59.0 | 5664 | 0.0487 | 0.8876 | 0.9186 | 0.9029 | 86 | 0.9494 | 0.9494 | 0.9494 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9442 | 0.9490 | 0.9466 | 0.9887 |
| 0.0149 | 60.0 | 5760 | 0.0485 | 0.8542 | 0.9535 | 0.9011 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9884 |
| 0.0149 | 61.0 | 5856 | 0.0494 | 0.8602 | 0.9302 | 0.8939 | 86 | 0.9489 | 0.9382 | 0.9435 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9369 | 0.9464 | 0.9416 | 0.9884 |
| 0.0152 | 62.0 | 5952 | 0.0518 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9379 | 0.9326 | 0.9352 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9347 | 0.9490 | 0.9418 | 0.9879 |
| 0.0158 | 63.0 | 6048 | 0.0566 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9435 | 0.9382 | 0.9408 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9395 | 0.9515 | 0.9455 | 0.9876 |
| 0.0145 | 64.0 | 6144 | 0.0486 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9432 | 0.9326 | 0.9379 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9439 | 0.9439 | 0.9439 | 0.9884 |
| 0.0151 | 65.0 | 6240 | 0.0520 | 0.8876 | 0.9186 | 0.9029 | 86 | 0.9379 | 0.9326 | 0.9352 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9389 | 0.9413 | 0.9401 | 0.9879 |
| 0.0138 | 66.0 | 6336 | 0.0626 | 0.8557 | 0.9651 | 0.9071 | 86 | 0.9322 | 0.9270 | 0.9296 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9277 | 0.9490 | 0.9382 | 0.9862 |
| 0.0139 | 67.0 | 6432 | 0.0595 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9364 | 0.9388 | 0.9376 | 0.9873 |
| 0.0128 | 68.0 | 6528 | 0.0603 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9315 | 0.9362 | 0.9338 | 0.9862 |
| 0.0133 | 69.0 | 6624 | 0.0587 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9385 | 0.9438 | 0.9412 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9345 | 0.9464 | 0.9404 | 0.9862 |
| 0.0147 | 70.0 | 6720 | 0.0565 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9315 | 0.9362 | 0.9338 | 0.9873 |
| 0.014 | 71.0 | 6816 | 0.0551 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9379 | 0.9326 | 0.9352 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9369 | 0.9464 | 0.9416 | 0.9868 |
| 0.0125 | 72.0 | 6912 | 0.0551 | 0.8889 | 0.9302 | 0.9091 | 86 | 0.9543 | 0.9382 | 0.9462 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9466 | 0.9490 | 0.9478 | 0.9892 |
| 0.0128 | 73.0 | 7008 | 0.0554 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.96 | 0.9438 | 0.9518 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9442 | 0.9490 | 0.9466 | 0.9881 |
| 0.0121 | 74.0 | 7104 | 0.0591 | 0.8617 | 0.9419 | 0.9000 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9395 | 0.9515 | 0.9455 | 0.9876 |
| 0.0114 | 75.0 | 7200 | 0.0618 | 0.8438 | 0.9419 | 0.8901 | 86 | 0.9593 | 0.9270 | 0.9429 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9367 | 0.9439 | 0.9403 | 0.9870 |
| 0.0127 | 76.0 | 7296 | 0.0610 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.96 | 0.9438 | 0.9518 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9442 | 0.9490 | 0.9466 | 0.9870 |
| 0.0105 | 77.0 | 7392 | 0.0619 | 0.8587 | 0.9186 | 0.8876 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9369 | 0.9464 | 0.9416 | 0.9868 |
| 0.0126 | 78.0 | 7488 | 0.0615 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9598 | 0.9382 | 0.9489 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9444 | 0.9541 | 0.9492 | 0.9879 |
| 0.0129 | 79.0 | 7584 | 0.0597 | 0.8587 | 0.9186 | 0.8876 | 86 | 0.9540 | 0.9326 | 0.9432 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9389 | 0.9413 | 0.9401 | 0.9868 |
| 0.0117 | 80.0 | 7680 | 0.0651 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9322 | 0.9270 | 0.9296 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9857 |
| 0.0112 | 81.0 | 7776 | 0.0596 | 0.8571 | 0.9070 | 0.8814 | 86 | 0.9337 | 0.9494 | 0.9415 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9298 | 0.9464 | 0.9381 | 0.9873 |
| 0.0108 | 82.0 | 7872 | 0.0616 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9438 | 0.9438 | 0.9438 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9370 | 0.9490 | 0.9430 | 0.9879 |
| 0.0109 | 83.0 | 7968 | 0.0610 | 0.8387 | 0.9070 | 0.8715 | 86 | 0.9326 | 0.9326 | 0.9326 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9246 | 0.9388 | 0.9316 | 0.9870 |
| 0.0105 | 84.0 | 8064 | 0.0619 | 0.8478 | 0.9070 | 0.8764 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9876 |
| 0.0111 | 85.0 | 8160 | 0.0611 | 0.8478 | 0.9070 | 0.8764 | 86 | 0.9438 | 0.9438 | 0.9438 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9320 | 0.9439 | 0.9379 | 0.9873 |
| 0.0109 | 86.0 | 8256 | 0.0596 | 0.8495 | 0.9186 | 0.8827 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9345 | 0.9464 | 0.9404 | 0.9879 |
| 0.0106 | 87.0 | 8352 | 0.0601 | 0.8571 | 0.9070 | 0.8814 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9367 | 0.9439 | 0.9403 | 0.9873 |
| 0.0097 | 88.0 | 8448 | 0.0602 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.96 | 0.9438 | 0.9518 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9468 | 0.9541 | 0.9504 | 0.9889 |
| 0.0094 | 89.0 | 8544 | 0.0606 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9443 | 0.9515 | 0.9479 | 0.9881 |
| 0.0103 | 90.0 | 8640 | 0.0609 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9394 | 0.9490 | 0.9442 | 0.9881 |
| 0.0113 | 91.0 | 8736 | 0.0585 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9442 | 0.9490 | 0.9466 | 0.9884 |
| 0.0116 | 92.0 | 8832 | 0.0592 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9435 | 0.9382 | 0.9408 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9369 | 0.9464 | 0.9416 | 0.9881 |
| 0.0115 | 93.0 | 8928 | 0.0589 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9419 | 0.9515 | 0.9467 | 0.9881 |
| 0.0097 | 94.0 | 9024 | 0.0598 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.96 | 0.9438 | 0.9518 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9443 | 0.9515 | 0.9479 | 0.9884 |
| 0.0092 | 95.0 | 9120 | 0.0598 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9443 | 0.9515 | 0.9479 | 0.9884 |
| 0.0095 | 96.0 | 9216 | 0.0586 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9881 |
| 0.0088 | 97.0 | 9312 | 0.0596 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9881 |
| 0.0086 | 98.0 | 9408 | 0.0593 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9881 |
| 0.0086 | 99.0 | 9504 | 0.0591 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9881 |
| 0.009 | 100.0 | 9600 | 0.0592 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9545 | 0.9438 | 0.9492 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9881 |
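The Overall columns in the table (and the evaluation summary at the top) are micro-averages over entity types, which is the usual seqeval convention. Assuming that convention, the integer counts below can be recovered from the reported per-entity precision, recall, and supports (e.g. location recall 0.9302 on 86 gold spans implies 80 true positives), and they reproduce the Overall row of the final epoch exactly:

```python
# Per-entity span counts recovered from the reported metrics (assumed,
# not published): (true positives, predicted spans, gold spans).
per_type = {
    "LOC": (80, 92, 86),
    "ORG": (168, 176, 178),
    "PER": (124, 127, 128),
}

# Micro-averaging sums the counts across entity types before dividing.
tp = sum(t for t, _, _ in per_type.values())      # 372
pred = sum(p for _, p, _ in per_type.values())    # 395
gold = sum(g for _, _, g in per_type.values())    # 392

precision = tp / pred         # 372/395 ≈ 0.9418
recall = tp / gold            # 372/392 ≈ 0.9490
f1 = 2 * tp / (pred + gold)   # 744/787 ≈ 0.9454
```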

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1