nerui-pt-pl10-1

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0446
  • Location Precision: 0.9244
  • Location Recall: 0.9483
  • Location F1: 0.9362
  • Location Number: 116
  • Organization Precision: 0.9671
  • Organization Recall: 0.9304
  • Organization F1: 0.9484
  • Organization Number: 158
  • Person Precision: 0.9609
  • Person Recall: 0.9919
  • Person F1: 0.9762
  • Person Number: 124
  • Overall Precision: 0.9524
  • Overall Recall: 0.9548
  • Overall F1: 0.9536
  • Overall Accuracy: 0.9907

Model description

More information needed

Intended uses & limitations

More information needed
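The model performs named-entity recognition for Indonesian text with location, organization, and person labels, inherited from its indolem/indobert-base-uncased base. A minimal usage sketch, assuming the checkpoint is available on the Hub as `apwic/nerui-pt-pl10-1` with a standard token-classification head (the exact label strings may differ from the illustrative output below):

```python
# Minimal inference sketch (assumptions: Hub id "apwic/nerui-pt-pl10-1",
# standard token-classification head; exact label strings may differ).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="apwic/nerui-pt-pl10-1",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

text = "Joko Widodo mengunjungi kantor Pertamina di Jakarta."  # illustrative Indonesian sentence
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```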

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
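These settings correspond to the standard `transformers` Trainer arguments. A minimal sketch of the matching `TrainingArguments` follows; the `output_dir` is a placeholder and the dataset/Trainer wiring is not shown, since the training script is not included in this card:

```python
# Hedged sketch: mirrors only the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="nerui-pt-pl10-1",      # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 matches the Trainer's
    # default optimizer settings, so no extra arguments are needed for it.
)
```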

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.8274 | 1.0 | 96 | 0.3896 | 0.3077 | 0.0345 | 0.0620 | 116 | 0.2133 | 0.3861 | 0.2748 | 158 | 0.2734 | 0.3065 | 0.2890 | 124 | 0.2352 | 0.2588 | 0.2464 | 0.8652 |
| 0.3677 | 2.0 | 192 | 0.2166 | 0.3923 | 0.4397 | 0.4146 | 116 | 0.5385 | 0.5316 | 0.5350 | 158 | 0.5673 | 0.7823 | 0.6576 | 124 | 0.5077 | 0.5829 | 0.5427 | 0.9319 |
| 0.1945 | 3.0 | 288 | 0.0954 | 0.85 | 0.7328 | 0.7870 | 116 | 0.7113 | 0.8734 | 0.7841 | 158 | 0.9385 | 0.9839 | 0.9606 | 124 | 0.8137 | 0.8668 | 0.8394 | 0.9706 |
| 0.1293 | 4.0 | 384 | 0.0825 | 0.8110 | 0.8879 | 0.8477 | 116 | 0.7784 | 0.9114 | 0.8397 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.8425 | 0.9271 | 0.8828 | 0.9747 |
| 0.1104 | 5.0 | 480 | 0.0516 | 0.8860 | 0.8707 | 0.8783 | 116 | 0.8757 | 0.9367 | 0.9052 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9118 | 0.9347 | 0.9231 | 0.9849 |
| 0.096 | 6.0 | 576 | 0.0490 | 0.7955 | 0.9052 | 0.8468 | 116 | 0.92 | 0.8734 | 0.8961 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.8927 | 0.9196 | 0.9059 | 0.9830 |
| 0.0809 | 7.0 | 672 | 0.0442 | 0.9048 | 0.8190 | 0.8597 | 116 | 0.8621 | 0.9494 | 0.9036 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9086 | 0.9246 | 0.9166 | 0.9844 |
| 0.0757 | 8.0 | 768 | 0.0419 | 0.8730 | 0.9483 | 0.9091 | 116 | 0.9182 | 0.9241 | 0.9211 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9221 | 0.9523 | 0.9370 | 0.9868 |
| 0.0694 | 9.0 | 864 | 0.0336 | 0.8926 | 0.9310 | 0.9114 | 116 | 0.9509 | 0.9810 | 0.9657 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9415 | 0.9698 | 0.9554 | 0.9918 |
| 0.0663 | 10.0 | 960 | 0.0414 | 0.8682 | 0.9655 | 0.9143 | 116 | 0.9605 | 0.9241 | 0.9419 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9293 | 0.9573 | 0.9431 | 0.9871 |
| 0.0625 | 11.0 | 1056 | 0.0336 | 0.888 | 0.9569 | 0.9212 | 116 | 0.9487 | 0.9367 | 0.9427 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9386 | 0.9598 | 0.9491 | 0.9898 |
| 0.0567 | 12.0 | 1152 | 0.0411 | 0.9043 | 0.8966 | 0.9004 | 116 | 0.9494 | 0.9494 | 0.9494 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9424 | 0.9447 | 0.9435 | 0.9887 |
| 0.0546 | 13.0 | 1248 | 0.0326 | 0.9145 | 0.9224 | 0.9185 | 116 | 0.9487 | 0.9367 | 0.9427 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9474 | 0.9497 | 0.9486 | 0.9898 |
| 0.0473 | 14.0 | 1344 | 0.0381 | 0.9016 | 0.9483 | 0.9244 | 116 | 0.9667 | 0.9177 | 0.9416 | 158 | 0.9606 | 0.9839 | 0.9721 | 124 | 0.9449 | 0.9472 | 0.9460 | 0.9882 |
| 0.0478 | 15.0 | 1440 | 0.0309 | 0.9091 | 0.9483 | 0.9283 | 116 | 0.9732 | 0.9177 | 0.9446 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9545 | 0.9497 | 0.9521 | 0.9898 |
| 0.0436 | 16.0 | 1536 | 0.0326 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9437 | 0.9557 | 0.9497 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9526 | 0.9598 | 0.9562 | 0.9915 |
| 0.0426 | 17.0 | 1632 | 0.0373 | 0.8889 | 0.9655 | 0.9256 | 116 | 0.96 | 0.9114 | 0.9351 | 158 | 0.9457 | 0.9839 | 0.9644 | 124 | 0.9333 | 0.9497 | 0.9415 | 0.9887 |
| 0.0399 | 18.0 | 1728 | 0.0320 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9548 | 0.9367 | 0.9457 | 158 | 0.976 | 0.9839 | 0.9799 | 124 | 0.95 | 0.9548 | 0.9524 | 0.9896 |
| 0.0373 | 19.0 | 1824 | 0.0373 | 0.9391 | 0.9310 | 0.9351 | 116 | 0.9494 | 0.9494 | 0.9494 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9549 | 0.9573 | 0.9561 | 0.9909 |
| 0.0354 | 20.0 | 1920 | 0.0333 | 0.9231 | 0.9310 | 0.9270 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9521 | 0.9497 | 0.9509 | 0.9904 |
| 0.0372 | 21.0 | 2016 | 0.0331 | 0.9402 | 0.9483 | 0.9442 | 116 | 0.9605 | 0.9241 | 0.9419 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9595 | 0.9523 | 0.9559 | 0.9909 |
| 0.034 | 22.0 | 2112 | 0.0341 | 0.9310 | 0.9310 | 0.9310 | 116 | 0.9494 | 0.9494 | 0.9494 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9525 | 0.9573 | 0.9549 | 0.9898 |
| 0.0331 | 23.0 | 2208 | 0.0359 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.95 | 0.9548 | 0.9524 | 0.9907 |
| 0.032 | 24.0 | 2304 | 0.0363 | 0.9397 | 0.9397 | 0.9397 | 116 | 0.9554 | 0.9494 | 0.9524 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9574 | 0.9598 | 0.9586 | 0.9904 |
| 0.0298 | 25.0 | 2400 | 0.0428 | 0.9407 | 0.9569 | 0.9487 | 116 | 0.9933 | 0.9367 | 0.9642 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9646 | 0.9598 | 0.9622 | 0.9915 |
| 0.0311 | 26.0 | 2496 | 0.0426 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9667 | 0.9177 | 0.9416 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9544 | 0.9472 | 0.9508 | 0.9898 |
| 0.0307 | 27.0 | 2592 | 0.0389 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9571 | 0.9523 | 0.9547 | 0.9907 |
| 0.0274 | 28.0 | 2688 | 0.0439 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9496 | 0.9472 | 0.9484 | 0.9896 |
| 0.0254 | 29.0 | 2784 | 0.0409 | 0.9187 | 0.9741 | 0.9456 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9502 | 0.9598 | 0.9550 | 0.9901 |
| 0.0277 | 30.0 | 2880 | 0.0346 | 0.9469 | 0.9224 | 0.9345 | 116 | 0.9387 | 0.9684 | 0.9533 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9480 | 0.9623 | 0.9551 | 0.9915 |
| 0.0267 | 31.0 | 2976 | 0.0378 | 0.9402 | 0.9483 | 0.9442 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9524 | 0.9548 | 0.9536 | 0.9896 |
| 0.026 | 32.0 | 3072 | 0.0435 | 0.9412 | 0.9655 | 0.9532 | 116 | 0.9932 | 0.9241 | 0.9574 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9670 | 0.9573 | 0.9621 | 0.9904 |
| 0.0241 | 33.0 | 3168 | 0.0361 | 0.9322 | 0.9483 | 0.9402 | 116 | 0.9799 | 0.9241 | 0.9511 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9668 | 0.9523 | 0.9595 | 0.9907 |
| 0.0245 | 34.0 | 3264 | 0.0383 | 0.9565 | 0.9483 | 0.9524 | 116 | 0.9613 | 0.9430 | 0.9521 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9621 | 0.9573 | 0.9597 | 0.9904 |
| 0.0228 | 35.0 | 3360 | 0.0424 | 0.9565 | 0.9483 | 0.9524 | 116 | 0.9737 | 0.9367 | 0.9548 | 158 | 0.9606 | 0.9839 | 0.9721 | 124 | 0.9645 | 0.9548 | 0.9596 | 0.9898 |
| 0.0218 | 36.0 | 3456 | 0.0417 | 0.9333 | 0.9655 | 0.9492 | 116 | 0.9667 | 0.9177 | 0.9416 | 158 | 0.9457 | 0.9839 | 0.9644 | 124 | 0.9499 | 0.9523 | 0.9511 | 0.9898 |
| 0.0221 | 37.0 | 3552 | 0.0416 | 0.9407 | 0.9569 | 0.9487 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9606 | 0.9839 | 0.9721 | 124 | 0.9548 | 0.9548 | 0.9548 | 0.9893 |
| 0.0233 | 38.0 | 3648 | 0.0446 | 0.9412 | 0.9655 | 0.9532 | 116 | 0.9730 | 0.9114 | 0.9412 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9595 | 0.9523 | 0.9559 | 0.9896 |
| 0.0197 | 39.0 | 3744 | 0.0419 | 0.9322 | 0.9483 | 0.9402 | 116 | 0.9419 | 0.9241 | 0.9329 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9451 | 0.9523 | 0.9487 | 0.9898 |
| 0.0203 | 40.0 | 3840 | 0.0392 | 0.9478 | 0.9397 | 0.9437 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9476 | 0.9548 | 0.9512 | 0.9907 |
| 0.0191 | 41.0 | 3936 | 0.0337 | 0.9397 | 0.9397 | 0.9397 | 116 | 0.9560 | 0.9620 | 0.9590 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9529 | 0.9648 | 0.9588 | 0.9923 |
| 0.017 | 42.0 | 4032 | 0.0371 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9412 | 0.9114 | 0.9260 | 158 | 0.9457 | 0.9839 | 0.9644 | 124 | 0.9375 | 0.9422 | 0.9398 | 0.9879 |
| 0.0174 | 43.0 | 4128 | 0.0394 | 0.9478 | 0.9397 | 0.9437 | 116 | 0.9545 | 0.9304 | 0.9423 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9571 | 0.9523 | 0.9547 | 0.9912 |
| 0.018 | 44.0 | 4224 | 0.0450 | 0.9008 | 0.9397 | 0.9198 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9426 | 0.9497 | 0.9462 | 0.9893 |
| 0.018 | 45.0 | 4320 | 0.0392 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9605 | 0.9241 | 0.9419 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9497 | 0.9497 | 0.9497 | 0.9893 |
| 0.0167 | 46.0 | 4416 | 0.0415 | 0.9231 | 0.9310 | 0.9270 | 116 | 0.9481 | 0.9241 | 0.9359 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9449 | 0.9472 | 0.9460 | 0.9893 |
| 0.0167 | 47.0 | 4512 | 0.0409 | 0.9397 | 0.9397 | 0.9397 | 116 | 0.98 | 0.9304 | 0.9545 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9619 | 0.9523 | 0.9571 | 0.9904 |
| 0.0156 | 48.0 | 4608 | 0.0417 | 0.925 | 0.9569 | 0.9407 | 116 | 0.9467 | 0.8987 | 0.9221 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9495 | 0.9447 | 0.9471 | 0.9890 |
| 0.0167 | 49.0 | 4704 | 0.0414 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9610 | 0.9367 | 0.9487 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9575 | 0.9623 | 0.9599 | 0.9909 |
| 0.0153 | 50.0 | 4800 | 0.0403 | 0.9328 | 0.9569 | 0.9447 | 116 | 0.9737 | 0.9367 | 0.9548 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9574 | 0.9598 | 0.9586 | 0.9909 |
| 0.0175 | 51.0 | 4896 | 0.0351 | 0.925 | 0.9569 | 0.9407 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9524 | 0.9548 | 0.9536 | 0.9904 |
| 0.015 | 52.0 | 4992 | 0.0383 | 0.925 | 0.9569 | 0.9407 | 116 | 0.9675 | 0.9430 | 0.9551 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9504 | 0.9623 | 0.9563 | 0.9909 |
| 0.0156 | 53.0 | 5088 | 0.0409 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9733 | 0.9241 | 0.9481 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9499 | 0.9523 | 0.9511 | 0.9901 |
| 0.0142 | 54.0 | 5184 | 0.0394 | 0.9381 | 0.9138 | 0.9258 | 116 | 0.9383 | 0.9620 | 0.95 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9478 | 0.9573 | 0.9525 | 0.9907 |
| 0.0165 | 55.0 | 5280 | 0.0384 | 0.9328 | 0.9569 | 0.9447 | 116 | 0.9545 | 0.9304 | 0.9423 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9501 | 0.9573 | 0.9537 | 0.9909 |
| 0.0138 | 56.0 | 5376 | 0.0411 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9675 | 0.9430 | 0.9551 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.955 | 0.9598 | 0.9574 | 0.9907 |
| 0.0136 | 57.0 | 5472 | 0.0429 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9475 | 0.9523 | 0.9499 | 0.9904 |
| 0.0136 | 58.0 | 5568 | 0.0429 | 0.925 | 0.9569 | 0.9407 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9525 | 0.9573 | 0.9549 | 0.9901 |
| 0.0138 | 59.0 | 5664 | 0.0395 | 0.9083 | 0.9397 | 0.9237 | 116 | 0.9355 | 0.9177 | 0.9265 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9378 | 0.9472 | 0.9425 | 0.9896 |
| 0.013 | 60.0 | 5760 | 0.0473 | 0.8976 | 0.9828 | 0.9383 | 116 | 0.9408 | 0.9051 | 0.9226 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9383 | 0.9548 | 0.9465 | 0.9890 |
| 0.0128 | 61.0 | 5856 | 0.0393 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9367 | 0.9367 | 0.9367 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9383 | 0.9548 | 0.9465 | 0.9904 |
| 0.011 | 62.0 | 5952 | 0.0399 | 0.9412 | 0.9655 | 0.9532 | 116 | 0.9613 | 0.9430 | 0.9521 | 158 | 0.976 | 0.9839 | 0.9799 | 124 | 0.9599 | 0.9623 | 0.9611 | 0.9915 |
| 0.0121 | 63.0 | 6048 | 0.0441 | 0.9174 | 0.9569 | 0.9367 | 116 | 0.9545 | 0.9304 | 0.9423 | 158 | 0.9457 | 0.9839 | 0.9644 | 124 | 0.9406 | 0.9548 | 0.9476 | 0.9904 |
| 0.0111 | 64.0 | 6144 | 0.0447 | 0.9402 | 0.9483 | 0.9442 | 116 | 0.9487 | 0.9367 | 0.9427 | 158 | 0.9457 | 0.9839 | 0.9644 | 124 | 0.9453 | 0.9548 | 0.95 | 0.9898 |
| 0.0117 | 65.0 | 6240 | 0.0431 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.9451 | 0.9523 | 0.9487 | 0.9901 |
| 0.011 | 66.0 | 6336 | 0.0496 | 0.9187 | 0.9741 | 0.9456 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9457 | 0.9839 | 0.9644 | 124 | 0.9479 | 0.9598 | 0.9538 | 0.9907 |
| 0.0112 | 67.0 | 6432 | 0.0465 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9481 | 0.9241 | 0.9359 | 158 | 0.9457 | 0.9839 | 0.9644 | 124 | 0.9403 | 0.9497 | 0.9450 | 0.9901 |
| 0.011 | 68.0 | 6528 | 0.0464 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9462 | 0.9919 | 0.9685 | 124 | 0.9429 | 0.9548 | 0.9488 | 0.9901 |
| 0.0095 | 69.0 | 6624 | 0.0453 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9548 | 0.9367 | 0.9457 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9429 | 0.9548 | 0.9488 | 0.9904 |
| 0.0118 | 70.0 | 6720 | 0.0483 | 0.9008 | 0.9397 | 0.9198 | 116 | 0.9494 | 0.9494 | 0.9494 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9363 | 0.9598 | 0.9479 | 0.9896 |
| 0.0111 | 71.0 | 6816 | 0.0469 | 0.9322 | 0.9483 | 0.9402 | 116 | 0.9613 | 0.9430 | 0.9521 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9526 | 0.9598 | 0.9562 | 0.9912 |
| 0.0104 | 72.0 | 6912 | 0.0474 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9605 | 0.9241 | 0.9419 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.945 | 0.9497 | 0.9474 | 0.9898 |
| 0.0104 | 73.0 | 7008 | 0.0445 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9667 | 0.9177 | 0.9416 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.9474 | 0.9497 | 0.9486 | 0.9898 |
| 0.0115 | 74.0 | 7104 | 0.0486 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9545 | 0.9304 | 0.9423 | 158 | 0.9457 | 0.9839 | 0.9644 | 124 | 0.945 | 0.9497 | 0.9474 | 0.9893 |
| 0.0109 | 75.0 | 7200 | 0.0507 | 0.9328 | 0.9569 | 0.9447 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.95 | 0.9548 | 0.9524 | 0.9893 |
| 0.0094 | 76.0 | 7296 | 0.0523 | 0.9098 | 0.9569 | 0.9328 | 116 | 0.9803 | 0.9430 | 0.9613 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9527 | 0.9623 | 0.9575 | 0.9898 |
| 0.0107 | 77.0 | 7392 | 0.0482 | 0.9174 | 0.9569 | 0.9367 | 116 | 0.9474 | 0.9114 | 0.9290 | 158 | 0.9606 | 0.9839 | 0.9721 | 124 | 0.9425 | 0.9472 | 0.9449 | 0.9901 |
| 0.0095 | 78.0 | 7488 | 0.0532 | 0.9322 | 0.9483 | 0.9402 | 116 | 0.9610 | 0.9367 | 0.9487 | 158 | 0.9457 | 0.9839 | 0.9644 | 124 | 0.9476 | 0.9548 | 0.9512 | 0.9896 |
| 0.0097 | 79.0 | 7584 | 0.0459 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9613 | 0.9430 | 0.9521 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9525 | 0.9573 | 0.9549 | 0.9912 |
| 0.0089 | 80.0 | 7680 | 0.0504 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9545 | 0.9304 | 0.9423 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.945 | 0.9497 | 0.9474 | 0.9887 |
| 0.0095 | 81.0 | 7776 | 0.0468 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9363 | 0.9304 | 0.9333 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9451 | 0.9523 | 0.9487 | 0.9907 |
| 0.0091 | 82.0 | 7872 | 0.0458 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9524 | 0.9548 | 0.9536 | 0.9904 |
| 0.0084 | 83.0 | 7968 | 0.0522 | 0.9322 | 0.9483 | 0.9402 | 116 | 0.9799 | 0.9241 | 0.9511 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9595 | 0.9523 | 0.9559 | 0.9901 |
| 0.009 | 84.0 | 8064 | 0.0516 | 0.925 | 0.9569 | 0.9407 | 116 | 0.9799 | 0.9241 | 0.9511 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9572 | 0.9548 | 0.9560 | 0.9901 |
| 0.0075 | 85.0 | 8160 | 0.0495 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9542 | 0.9241 | 0.9389 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9451 | 0.9523 | 0.9487 | 0.9896 |
| 0.0089 | 86.0 | 8256 | 0.0494 | 0.9083 | 0.9397 | 0.9237 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9474 | 0.9497 | 0.9486 | 0.9890 |
| 0.0087 | 87.0 | 8352 | 0.0518 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9481 | 0.9241 | 0.9359 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.9403 | 0.9497 | 0.9450 | 0.9893 |
| 0.0096 | 88.0 | 8448 | 0.0470 | 0.9083 | 0.9397 | 0.9237 | 116 | 0.96 | 0.9114 | 0.9351 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.9422 | 0.9422 | 0.9422 | 0.9890 |
| 0.0081 | 89.0 | 8544 | 0.0464 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9481 | 0.9241 | 0.9359 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.9403 | 0.9497 | 0.9450 | 0.9904 |
| 0.0083 | 90.0 | 8640 | 0.0468 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9605 | 0.9241 | 0.9419 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.945 | 0.9497 | 0.9474 | 0.9896 |
| 0.0079 | 91.0 | 8736 | 0.0455 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.9380 | 0.9497 | 0.9438 | 0.9898 |
| 0.0086 | 92.0 | 8832 | 0.0436 | 0.925 | 0.9569 | 0.9407 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9525 | 0.9573 | 0.9549 | 0.9912 |
| 0.0079 | 93.0 | 8928 | 0.0450 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.9497 | 0.9497 | 0.9497 | 0.9901 |
| 0.0083 | 94.0 | 9024 | 0.0442 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9545 | 0.9304 | 0.9423 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9453 | 0.9548 | 0.95 | 0.9907 |
| 0.0076 | 95.0 | 9120 | 0.0451 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9524 | 0.9548 | 0.9536 | 0.9904 |
| 0.0072 | 96.0 | 9216 | 0.0459 | 0.9328 | 0.9569 | 0.9447 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9597 | 0.9573 | 0.9585 | 0.9909 |
| 0.0072 | 97.0 | 9312 | 0.0449 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9524 | 0.9548 | 0.9536 | 0.9907 |
| 0.0059 | 98.0 | 9408 | 0.0450 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9524 | 0.9548 | 0.9536 | 0.9909 |
| 0.0081 | 99.0 | 9504 | 0.0446 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9524 | 0.9548 | 0.9536 | 0.9909 |
| 0.0074 | 100.0 | 9600 | 0.0446 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.9524 | 0.9548 | 0.9536 | 0.9907 |
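
The per-entity precision, recall, and F1 values in the table are span-level scores of the kind produced by seqeval over BIO-tagged sequences. A minimal sketch of how such scores are computed; the label sequences below are made up for illustration and are not taken from this run:

```python
# Hedged sketch: span-level NER metrics with seqeval (illustrative sequences only).
from seqeval.metrics import classification_report

y_true = [["B-PER", "I-PER", "O", "B-ORG", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "B-ORG", "O", "O"]]

# Reports per-entity precision/recall/F1 plus micro and macro averages.
print(classification_report(y_true, y_pred, digits=4))
```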

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2
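
A quick way to confirm that a local environment matches these versions:

```python
# Print installed versions to compare with those listed in this card.
import datasets, tokenizers, torch, transformers

print("Transformers:", transformers.__version__)  # card lists 4.39.3
print("PyTorch:", torch.__version__)              # card lists 2.3.0+cu121
print("Datasets:", datasets.__version__)          # card lists 2.19.1
print("Tokenizers:", tokenizers.__version__)      # card lists 0.15.2
```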

Model tree for apwic/nerui-pt-pl10-1

This model is fine-tuned from indolem/indobert-base-uncased.