nerui-pt-pl10-3

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0600
  • Location Precision: 0.8646
  • Location Recall: 0.9651
  • Location F1: 0.9121
  • Location Number: 86
  • Organization Precision: 0.9480
  • Organization Recall: 0.9213
  • Organization F1: 0.9345
  • Organization Number: 178
  • Person Precision: 0.9766
  • Person Recall: 0.9766
  • Person F1: 0.9766
  • Person Number: 128
  • Overall Precision: 0.9370
  • Overall Recall: 0.9490
  • Overall F1: 0.9430
  • Overall Accuracy: 0.9879
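
As a consistency check, the overall scores above can be recovered from the per-entity numbers by seqeval-style micro-averaging: sum true positives, predicted spans, and gold spans across entity types, then recompute precision, recall, and F1. A minimal sketch in plain Python, using the per-entity precision/recall/support values from the list above:

```python
# Per-entity (precision, recall, support) from the evaluation set above.
entities = {
    "Location":     (0.8646, 0.9651, 86),
    "Organization": (0.9480, 0.9213, 178),
    "Person":       (0.9766, 0.9766, 128),
}

tp = pred = gold = 0
for precision, recall, support in entities.values():
    true_pos = round(recall * support)   # recall = TP / support
    tp += true_pos
    pred += round(true_pos / precision)  # precision = TP / predicted
    gold += support

overall_p = tp / pred
overall_r = tp / gold
overall_f1 = 2 * overall_p * overall_r / (overall_p + overall_r)
print(f"precision={overall_p:.4f} recall={overall_r:.4f} f1={overall_f1:.4f}")
# → precision=0.9370 recall=0.9490 f1=0.9430
```

The micro-averaged numbers match the reported overall metrics, which confirms they are entity-level (span) scores rather than a macro average over the three types.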

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
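
With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 5e-05 to zero over the whole run; the results table below shows 96 optimization steps per epoch, so 100 epochs give 9,600 total steps. A small sketch of the schedule shape (Transformers computes this internally; the function here is only illustrative):

```python
# Linear decay with no warmup: lr(step) = initial_lr * (1 - step / total_steps).
INITIAL_LR = 5e-05
TOTAL_STEPS = 9600  # 96 steps/epoch * 100 epochs, from the results table

def linear_lr(step: int) -> float:
    """Learning rate after `step` optimizer updates."""
    return INITIAL_LR * max(0.0, 1.0 - step / TOTAL_STEPS)

print(linear_lr(0))     # 5e-05 at the start
print(linear_lr(4800))  # 2.5e-05 halfway through
print(linear_lr(9600))  # 0.0 at the end
```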

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.8464 | 1.0 | 96 | 0.3799 | 0.0 | 0.0 | 0.0 | 86 | 0.2409 | 0.2978 | 0.2663 | 178 | 0.3154 | 0.3672 | 0.3394 | 128 | 0.2667 | 0.2551 | 0.2608 | 0.8748 |
| 0.3634 | 2.0 | 192 | 0.2157 | 0.3509 | 0.4651 | 0.4 | 86 | 0.6235 | 0.5674 | 0.5941 | 178 | 0.5843 | 0.7578 | 0.6599 | 128 | 0.5385 | 0.6071 | 0.5707 | 0.9331 |
| 0.1975 | 3.0 | 288 | 0.0942 | 0.7831 | 0.7558 | 0.7692 | 86 | 0.7304 | 0.8371 | 0.7801 | 178 | 0.9470 | 0.9766 | 0.9615 | 128 | 0.8091 | 0.8648 | 0.8360 | 0.9692 |
| 0.1277 | 4.0 | 384 | 0.0909 | 0.7396 | 0.8256 | 0.7802 | 86 | 0.8110 | 0.7472 | 0.7778 | 178 | 0.9466 | 0.9688 | 0.9575 | 128 | 0.8389 | 0.8367 | 0.8378 | 0.9703 |
| 0.1039 | 5.0 | 480 | 0.0632 | 0.7551 | 0.8605 | 0.8043 | 86 | 0.8876 | 0.8876 | 0.8876 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.8812 | 0.9082 | 0.8945 | 0.9784 |
| 0.0916 | 6.0 | 576 | 0.0719 | 0.6724 | 0.9070 | 0.7723 | 86 | 0.9032 | 0.7865 | 0.8408 | 178 | 0.9615 | 0.9766 | 0.9690 | 128 | 0.8554 | 0.875 | 0.8651 | 0.9744 |
| 0.0815 | 7.0 | 672 | 0.0528 | 0.8409 | 0.8605 | 0.8506 | 86 | 0.8743 | 0.9382 | 0.9051 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9015 | 0.9337 | 0.9173 | 0.9841 |
| 0.0776 | 8.0 | 768 | 0.0514 | 0.8824 | 0.8721 | 0.8772 | 86 | 0.8967 | 0.9270 | 0.9116 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9192 | 0.9286 | 0.9239 | 0.9830 |
| 0.0697 | 9.0 | 864 | 0.0519 | 0.7961 | 0.9535 | 0.8677 | 86 | 0.9070 | 0.8764 | 0.8914 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9030 | 0.9260 | 0.9144 | 0.9825 |
| 0.0641 | 10.0 | 960 | 0.0573 | 0.8119 | 0.9535 | 0.8770 | 86 | 0.9075 | 0.8820 | 0.8946 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9055 | 0.9286 | 0.9169 | 0.9803 |
| 0.0613 | 11.0 | 1056 | 0.0494 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9527 | 0.9045 | 0.9280 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9389 | 0.9413 | 0.9401 | 0.9854 |
| 0.0578 | 12.0 | 1152 | 0.0461 | 0.8438 | 0.9419 | 0.8901 | 86 | 0.8956 | 0.9157 | 0.9056 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9066 | 0.9413 | 0.9237 | 0.9857 |
| 0.0527 | 13.0 | 1248 | 0.0581 | 0.8367 | 0.9535 | 0.8913 | 86 | 0.9294 | 0.8876 | 0.9080 | 178 | 0.9538 | 0.9688 | 0.9612 | 128 | 0.9146 | 0.9286 | 0.9215 | 0.9827 |
| 0.0485 | 14.0 | 1344 | 0.0523 | 0.8298 | 0.9070 | 0.8667 | 86 | 0.9306 | 0.9045 | 0.9174 | 178 | 0.9615 | 0.9766 | 0.9690 | 128 | 0.9169 | 0.9286 | 0.9227 | 0.9830 |
| 0.0467 | 15.0 | 1440 | 0.0490 | 0.8778 | 0.9186 | 0.8977 | 86 | 0.9266 | 0.9213 | 0.9239 | 178 | 0.9615 | 0.9766 | 0.9690 | 128 | 0.9270 | 0.9388 | 0.9328 | 0.9849 |
| 0.0459 | 16.0 | 1536 | 0.0469 | 0.8421 | 0.9302 | 0.8840 | 86 | 0.9261 | 0.9157 | 0.9209 | 178 | 0.9612 | 0.9688 | 0.9650 | 128 | 0.9175 | 0.9362 | 0.9268 | 0.9865 |
| 0.0449 | 17.0 | 1632 | 0.0481 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9153 | 0.9101 | 0.9127 | 178 | 0.9615 | 0.9766 | 0.9690 | 128 | 0.9179 | 0.9413 | 0.9295 | 0.9854 |
| 0.0419 | 18.0 | 1728 | 0.0519 | 0.8511 | 0.9302 | 0.8889 | 86 | 0.9405 | 0.8876 | 0.9133 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9308 | 0.9260 | 0.9284 | 0.9843 |
| 0.0408 | 19.0 | 1824 | 0.0408 | 0.9080 | 0.9186 | 0.9133 | 86 | 0.9341 | 0.9551 | 0.9444 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9421 | 0.9541 | 0.9480 | 0.9879 |
| 0.0378 | 20.0 | 1920 | 0.0570 | 0.8081 | 0.9302 | 0.8649 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9175 | 0.9362 | 0.9268 | 0.9852 |
| 0.0373 | 21.0 | 2016 | 0.0538 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9521 | 0.8933 | 0.9217 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9361 | 0.9337 | 0.9349 | 0.9846 |
| 0.0369 | 22.0 | 2112 | 0.0506 | 0.8351 | 0.9419 | 0.8852 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9223 | 0.9388 | 0.9305 | 0.9854 |
| 0.0357 | 23.0 | 2208 | 0.0469 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9347 | 0.9490 | 0.9418 | 0.9879 |
| 0.0347 | 24.0 | 2304 | 0.0499 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9322 | 0.9270 | 0.9296 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9347 | 0.9490 | 0.9418 | 0.9860 |
| 0.034 | 25.0 | 2400 | 0.0472 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9316 | 0.9388 | 0.9352 | 0.9868 |
| 0.031 | 26.0 | 2496 | 0.0428 | 0.8737 | 0.9651 | 0.9171 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9879 |
| 0.0302 | 27.0 | 2592 | 0.0519 | 0.8317 | 0.9767 | 0.8984 | 86 | 0.9253 | 0.9045 | 0.9148 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9181 | 0.9439 | 0.9308 | 0.9868 |
| 0.0288 | 28.0 | 2688 | 0.0437 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9385 | 0.9438 | 0.9412 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9421 | 0.9541 | 0.9480 | 0.9879 |
| 0.0282 | 29.0 | 2784 | 0.0406 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9389 | 0.9494 | 0.9441 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9447 | 0.9592 | 0.9519 | 0.9903 |
| 0.0279 | 30.0 | 2880 | 0.0520 | 0.8469 | 0.9651 | 0.9022 | 86 | 0.9651 | 0.9326 | 0.9486 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9397 | 0.9541 | 0.9468 | 0.9862 |
| 0.0261 | 31.0 | 2976 | 0.0489 | 0.8317 | 0.9767 | 0.8984 | 86 | 0.9306 | 0.9045 | 0.9174 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9204 | 0.9439 | 0.9320 | 0.9868 |
| 0.0239 | 32.0 | 3072 | 0.0426 | 0.8542 | 0.9535 | 0.9011 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9273 | 0.9439 | 0.9355 | 0.9889 |
| 0.0262 | 33.0 | 3168 | 0.0508 | 0.8617 | 0.9419 | 0.9000 | 86 | 0.9318 | 0.9213 | 0.9266 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9296 | 0.9439 | 0.9367 | 0.9865 |
| 0.0233 | 34.0 | 3264 | 0.0522 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9371 | 0.9213 | 0.9292 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9868 |
| 0.0234 | 35.0 | 3360 | 0.0454 | 0.8557 | 0.9651 | 0.9071 | 86 | 0.9593 | 0.9270 | 0.9429 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9419 | 0.9515 | 0.9467 | 0.9889 |
| 0.0242 | 36.0 | 3456 | 0.0452 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9657 | 0.9494 | 0.9575 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9471 | 0.9592 | 0.9531 | 0.9892 |
| 0.0223 | 37.0 | 3552 | 0.0422 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9432 | 0.9326 | 0.9379 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9372 | 0.9515 | 0.9443 | 0.9892 |
| 0.0225 | 38.0 | 3648 | 0.0490 | 0.82 | 0.9535 | 0.8817 | 86 | 0.9357 | 0.8989 | 0.9169 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9198 | 0.9362 | 0.9279 | 0.9860 |
| 0.0233 | 39.0 | 3744 | 0.0521 | 0.8485 | 0.9767 | 0.9081 | 86 | 0.9326 | 0.9326 | 0.9326 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9259 | 0.9566 | 0.9410 | 0.9870 |
| 0.0197 | 40.0 | 3840 | 0.0488 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9235 | 0.9494 | 0.9363 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9305 | 0.9566 | 0.9434 | 0.9884 |
| 0.0205 | 41.0 | 3936 | 0.0494 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9379 | 0.9326 | 0.9352 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9372 | 0.9515 | 0.9443 | 0.9884 |
| 0.0206 | 42.0 | 4032 | 0.0436 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9379 | 0.9326 | 0.9352 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9416 | 0.9464 | 0.9440 | 0.9887 |
| 0.0194 | 43.0 | 4128 | 0.0476 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9486 | 0.9326 | 0.9405 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9395 | 0.9515 | 0.9455 | 0.9889 |
| 0.019 | 44.0 | 4224 | 0.0517 | 0.8454 | 0.9535 | 0.8962 | 86 | 0.9529 | 0.9101 | 0.9310 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9342 | 0.9413 | 0.9377 | 0.9876 |
| 0.0199 | 45.0 | 4320 | 0.0490 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9540 | 0.9326 | 0.9432 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9395 | 0.9515 | 0.9455 | 0.9881 |
| 0.0165 | 46.0 | 4416 | 0.0579 | 0.8660 | 0.9767 | 0.9180 | 86 | 0.9415 | 0.9045 | 0.9226 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9862 |
| 0.0178 | 47.0 | 4512 | 0.0522 | 0.8542 | 0.9535 | 0.9011 | 86 | 0.9422 | 0.9157 | 0.9288 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9876 |
| 0.0173 | 48.0 | 4608 | 0.0547 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9379 | 0.9326 | 0.9352 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9348 | 0.9515 | 0.9431 | 0.9879 |
| 0.0176 | 49.0 | 4704 | 0.0536 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9593 | 0.9270 | 0.9429 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9873 |
| 0.0173 | 50.0 | 4800 | 0.0541 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9486 | 0.9326 | 0.9405 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9415 | 0.9439 | 0.9427 | 0.9870 |
| 0.016 | 51.0 | 4896 | 0.0534 | 0.8602 | 0.9302 | 0.8939 | 86 | 0.9408 | 0.8933 | 0.9164 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9333 | 0.9286 | 0.9309 | 0.9852 |
| 0.0144 | 52.0 | 4992 | 0.0561 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9365 | 0.9413 | 0.9389 | 0.9879 |
| 0.0163 | 53.0 | 5088 | 0.0604 | 0.8737 | 0.9651 | 0.9171 | 86 | 0.9322 | 0.9270 | 0.9296 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9325 | 0.9515 | 0.9419 | 0.9876 |
| 0.0157 | 54.0 | 5184 | 0.0543 | 0.8737 | 0.9651 | 0.9171 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9345 | 0.9464 | 0.9404 | 0.9868 |
| 0.015 | 55.0 | 5280 | 0.0539 | 0.875 | 0.9767 | 0.9231 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9347 | 0.9490 | 0.9418 | 0.9876 |
| 0.0124 | 56.0 | 5376 | 0.0579 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9360 | 0.9045 | 0.9200 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9868 |
| 0.0143 | 57.0 | 5472 | 0.0590 | 0.84 | 0.9767 | 0.9032 | 86 | 0.9148 | 0.9045 | 0.9096 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9158 | 0.9439 | 0.9296 | 0.9860 |
| 0.014 | 58.0 | 5568 | 0.0608 | 0.8660 | 0.9767 | 0.9180 | 86 | 0.9415 | 0.9045 | 0.9226 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9868 |
| 0.0139 | 59.0 | 5664 | 0.0554 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9435 | 0.9382 | 0.9408 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9373 | 0.9541 | 0.9456 | 0.9873 |
| 0.0147 | 60.0 | 5760 | 0.0585 | 0.8557 | 0.9651 | 0.9071 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9370 | 0.9490 | 0.9430 | 0.9870 |
| 0.0136 | 61.0 | 5856 | 0.0562 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9435 | 0.9382 | 0.9408 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9373 | 0.9541 | 0.9456 | 0.9870 |
| 0.0134 | 62.0 | 5952 | 0.0526 | 0.8830 | 0.9651 | 0.9222 | 86 | 0.9282 | 0.9438 | 0.9359 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9330 | 0.9592 | 0.9459 | 0.9884 |
| 0.014 | 63.0 | 6048 | 0.0560 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9278 | 0.9382 | 0.9330 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9352 | 0.9566 | 0.9458 | 0.9876 |
| 0.0136 | 64.0 | 6144 | 0.0489 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9282 | 0.9438 | 0.9359 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9352 | 0.9566 | 0.9458 | 0.9892 |
| 0.0124 | 65.0 | 6240 | 0.0550 | 0.8925 | 0.9651 | 0.9274 | 86 | 0.9659 | 0.9551 | 0.9605 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9545 | 0.9643 | 0.9594 | 0.9895 |
| 0.0117 | 66.0 | 6336 | 0.0521 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9540 | 0.9326 | 0.9432 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9395 | 0.9515 | 0.9455 | 0.9887 |
| 0.0127 | 67.0 | 6432 | 0.0509 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9655 | 0.9438 | 0.9545 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9470 | 0.9566 | 0.9518 | 0.9892 |
| 0.012 | 68.0 | 6528 | 0.0563 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9282 | 0.9438 | 0.9359 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9373 | 0.9541 | 0.9456 | 0.9879 |
| 0.0118 | 69.0 | 6624 | 0.0548 | 0.8660 | 0.9767 | 0.9180 | 86 | 0.9483 | 0.9270 | 0.9375 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9373 | 0.9541 | 0.9456 | 0.9887 |
| 0.0119 | 70.0 | 6720 | 0.0591 | 0.8557 | 0.9651 | 0.9071 | 86 | 0.9375 | 0.9270 | 0.9322 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9302 | 0.9515 | 0.9407 | 0.9873 |
| 0.0119 | 71.0 | 6816 | 0.0608 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9286 | 0.9494 | 0.9389 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9353 | 0.9592 | 0.9471 | 0.9870 |
| 0.0102 | 72.0 | 6912 | 0.0570 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9382 | 0.9382 | 0.9382 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9373 | 0.9541 | 0.9456 | 0.9879 |
| 0.0108 | 73.0 | 7008 | 0.0602 | 0.8571 | 0.9767 | 0.9130 | 86 | 0.9422 | 0.9157 | 0.9288 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9323 | 0.9490 | 0.9406 | 0.9876 |
| 0.0112 | 74.0 | 7104 | 0.0605 | 0.8542 | 0.9535 | 0.9011 | 86 | 0.9477 | 0.9157 | 0.9314 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9870 |
| 0.011 | 75.0 | 7200 | 0.0666 | 0.84 | 0.9767 | 0.9032 | 86 | 0.9133 | 0.8876 | 0.9003 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9152 | 0.9362 | 0.9256 | 0.9857 |
| 0.0098 | 76.0 | 7296 | 0.0570 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9326 | 0.9326 | 0.9326 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9325 | 0.9515 | 0.9419 | 0.9879 |
| 0.0098 | 77.0 | 7392 | 0.0579 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9322 | 0.9270 | 0.9296 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9302 | 0.9515 | 0.9407 | 0.9876 |
| 0.0099 | 78.0 | 7488 | 0.0626 | 0.8571 | 0.9767 | 0.9130 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9277 | 0.9490 | 0.9382 | 0.9865 |
| 0.0107 | 79.0 | 7584 | 0.0580 | 0.8830 | 0.9651 | 0.9222 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9345 | 0.9464 | 0.9404 | 0.9873 |
| 0.0103 | 80.0 | 7680 | 0.0593 | 0.8660 | 0.9767 | 0.9180 | 86 | 0.9326 | 0.9326 | 0.9326 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9280 | 0.9541 | 0.9409 | 0.9881 |
| 0.0091 | 81.0 | 7776 | 0.0626 | 0.8660 | 0.9767 | 0.9180 | 86 | 0.9543 | 0.9382 | 0.9462 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.94 | 0.9592 | 0.9495 | 0.9887 |
| 0.0105 | 82.0 | 7872 | 0.0579 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9298 | 0.9464 | 0.9381 | 0.9868 |
| 0.0103 | 83.0 | 7968 | 0.0657 | 0.8660 | 0.9767 | 0.9180 | 86 | 0.9527 | 0.9045 | 0.9280 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9391 | 0.9439 | 0.9415 | 0.9865 |
| 0.0118 | 84.0 | 8064 | 0.0636 | 0.8660 | 0.9767 | 0.9180 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9322 | 0.9464 | 0.9392 | 0.9865 |
| 0.0095 | 85.0 | 8160 | 0.0632 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9310 | 0.9101 | 0.9205 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9295 | 0.9413 | 0.9354 | 0.9868 |
| 0.0089 | 86.0 | 8256 | 0.0606 | 0.8660 | 0.9767 | 0.9180 | 86 | 0.9415 | 0.9045 | 0.9226 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9862 |
| 0.0086 | 87.0 | 8352 | 0.0599 | 0.8557 | 0.9651 | 0.9071 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9347 | 0.9490 | 0.9418 | 0.9879 |
| 0.0104 | 88.0 | 8448 | 0.0596 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9483 | 0.9270 | 0.9375 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9372 | 0.9515 | 0.9443 | 0.9892 |
| 0.0081 | 89.0 | 8544 | 0.0583 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9483 | 0.9270 | 0.9375 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9372 | 0.9515 | 0.9443 | 0.9895 |
| 0.0076 | 90.0 | 8640 | 0.0589 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9429 | 0.9270 | 0.9348 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9348 | 0.9515 | 0.9431 | 0.9887 |
| 0.0092 | 91.0 | 8736 | 0.0612 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9347 | 0.9490 | 0.9418 | 0.9881 |
| 0.008 | 92.0 | 8832 | 0.0596 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9370 | 0.9490 | 0.9430 | 0.9879 |
| 0.0086 | 93.0 | 8928 | 0.0602 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9298 | 0.9464 | 0.9381 | 0.9865 |
| 0.0063 | 94.0 | 9024 | 0.0605 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9593 | 0.9270 | 0.9429 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9419 | 0.9515 | 0.9467 | 0.9876 |
| 0.0078 | 95.0 | 9120 | 0.0594 | 0.8737 | 0.9651 | 0.9171 | 86 | 0.9543 | 0.9382 | 0.9462 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9422 | 0.9566 | 0.9494 | 0.9887 |
| 0.0079 | 96.0 | 9216 | 0.0588 | 0.8737 | 0.9651 | 0.9171 | 86 | 0.9483 | 0.9270 | 0.9375 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9395 | 0.9515 | 0.9455 | 0.9881 |
| 0.0083 | 97.0 | 9312 | 0.0615 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9593 | 0.9270 | 0.9429 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9419 | 0.9515 | 0.9467 | 0.9881 |
| 0.0075 | 98.0 | 9408 | 0.0611 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9535 | 0.9213 | 0.9371 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9394 | 0.9490 | 0.9442 | 0.9873 |
| 0.0083 | 99.0 | 9504 | 0.0600 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9535 | 0.9213 | 0.9371 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9394 | 0.9490 | 0.9442 | 0.9876 |
| 0.0074 | 100.0 | 9600 | 0.0600 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9370 | 0.9490 | 0.9430 | 0.9879 |

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2
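
With the versions above installed, the checkpoint can be loaded through the standard Transformers token-classification pipeline. A minimal sketch, assuming the repository id `apwic/nerui-pt-pl10-3` from this card and an illustrative Indonesian sentence (the base model is Indonesian IndoBERT); the entity labels follow the Location/Organization/Person types reported above:

```python
from transformers import pipeline

# "simple" aggregation merges word pieces back into whole entity spans.
ner = pipeline(
    "token-classification",
    model="apwic/nerui-pt-pl10-3",
    aggregation_strategy="simple",
)

# Illustrative example sentence, not from the (unknown) training data.
for entity in ner("Joko Widodo mengunjungi kantor Pertamina di Jakarta."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

This downloads the model weights on first use; pass a local path instead of the repository id to run fully offline.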

Model tree for apwic/nerui-pt-pl10-3

  • Finetuned from: indolem/indobert-base-uncased