
nerui-pt-pl10-2

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0689
  • Location Precision: 0.8889
  • Location Recall: 0.9462
  • Location F1: 0.9167
  • Location Number: 93
  • Organization Precision: 0.9157
  • Organization Recall: 0.9157
  • Organization F1: 0.9157
  • Organization Number: 166
  • Person Precision: 0.9787
  • Person Recall: 0.9718
  • Person F1: 0.9753
  • Person Number: 142
  • Overall Precision: 0.9310
  • Overall Recall: 0.9426
  • Overall F1: 0.9368
  • Overall Accuracy: 0.9877
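As an illustrative sanity check (not part of the original card), the overall precision, recall, and F1 above are the micro-average of the three entity types: true positives are recovered from recall × support, and predicted counts from TP / precision, rounded to whole entities.

```python
# Illustrative check: the overall scores are the micro-average of the
# per-entity counts reported above.
per_entity = {  # (precision, recall, support) from the evaluation set
    "LOC": (0.8889, 0.9462, 93),
    "ORG": (0.9157, 0.9157, 166),
    "PER": (0.9787, 0.9718, 142),
}

# Recover integer counts: TP = recall * support, predicted = TP / precision.
tp = sum(round(r * n) for p, r, n in per_entity.values())               # 378
pred = sum(round(round(r * n) / p) for p, r, n in per_entity.values())  # 406
gold = sum(n for _, _, n in per_entity.values())                        # 401

precision = tp / pred                                # ~0.9310
recall = tp / gold                                   # ~0.9426
f1 = 2 * precision * recall / (precision + recall)   # ~0.9368
```

These reproduce the reported overall values of 0.9310 / 0.9426 / 0.9368 to four decimal places.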

Model description

More information needed

Intended uses & limitations

More information needed
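Although the card gives no official usage snippet, a checkpoint trained for named-entity recognition like this one can typically be loaded with the Hugging Face token-classification pipeline. The repo id `apwic/nerui-pt-pl10-2` is taken from the model tree below; the example sentence is an assumption.

```python
# Hypothetical usage sketch, assuming the checkpoint is published as
# apwic/nerui-pt-pl10-2 with standard token-classification heads.
from transformers import pipeline


def load_ner(model_id: str = "apwic/nerui-pt-pl10-2"):
    # aggregation_strategy="simple" merges word-piece tokens into entity spans.
    return pipeline("token-classification", model=model_id,
                    aggregation_strategy="simple")


if __name__ == "__main__":
    ner = load_ner()
    # Expect LOC/ORG/PER-style spans for an Indonesian sentence.
    print(ner("Joko Widodo lahir di Surakarta."))
```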

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
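The hyperparameters above map onto a `transformers` `TrainingArguments` configuration roughly as follows; this is a sketch only, and `output_dir` plus any settings not listed in the card are assumptions.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters as TrainingArguments.
# output_dir is an assumption; all other values are from the card.
args = TrainingArguments(
    output_dir="nerui-pt-pl10-2",
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
)
```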

Training results

Per-epoch validation metrics (P = precision, R = recall, N = support):

Train Loss Epoch Step Val Loss Loc P Loc R Loc F1 Loc N Org P Org R Org F1 Org N Per P Per R Per F1 Per N Overall P Overall R Overall F1 Overall Acc
0.8608 1.0 96 0.3943 0.25 0.0215 0.0396 93 0.2151 0.2410 0.2273 166 0.2889 0.2746 0.2816 142 0.2462 0.2020 0.2219 0.8650
0.3625 2.0 192 0.2152 0.3333 0.5161 0.4051 93 0.6358 0.5783 0.6057 166 0.5864 0.7887 0.6727 142 0.5267 0.6384 0.5772 0.9405
0.1958 3.0 288 0.0969 0.8333 0.7527 0.7910 93 0.6734 0.8072 0.7342 166 0.9459 0.9859 0.9655 142 0.7981 0.8579 0.8269 0.9676
0.1348 4.0 384 0.0837 0.6842 0.8387 0.7536 93 0.7436 0.8735 0.8033 166 0.9722 0.9859 0.9790 142 0.8013 0.9052 0.8501 0.9712
0.1081 5.0 480 0.0644 0.8778 0.8495 0.8634 93 0.8343 0.8795 0.8563 166 0.9718 0.9718 0.9718 142 0.8919 0.9052 0.8985 0.9808
0.0952 6.0 576 0.0537 0.75 0.9032 0.8195 93 0.8606 0.8554 0.8580 166 0.9858 0.9789 0.9823 142 0.8732 0.9102 0.8913 0.9824
0.0875 7.0 672 0.0596 0.82 0.8817 0.8497 93 0.8432 0.9398 0.8889 166 0.9789 0.9789 0.9789 142 0.8829 0.9401 0.9106 0.9816
0.073 8.0 768 0.0538 0.8365 0.9355 0.8832 93 0.88 0.9277 0.9032 166 0.9789 0.9789 0.9789 142 0.9026 0.9476 0.9246 0.9835
0.0696 9.0 864 0.0475 0.8673 0.9140 0.8901 93 0.8817 0.8976 0.8896 166 0.9718 0.9718 0.9718 142 0.9095 0.9277 0.9185 0.9857
0.064 10.0 960 0.0520 0.89 0.9570 0.9223 93 0.9264 0.9096 0.9179 166 0.9720 0.9789 0.9754 142 0.9335 0.9451 0.9393 0.9868
0.0626 11.0 1056 0.0440 0.8812 0.9570 0.9175 93 0.9371 0.8976 0.9169 166 0.9650 0.9718 0.9684 142 0.9330 0.9377 0.9353 0.9879
0.0554 12.0 1152 0.0455 0.9140 0.9140 0.9140 93 0.8851 0.9277 0.9059 166 0.9720 0.9789 0.9754 142 0.9220 0.9426 0.9322 0.9871
0.0548 13.0 1248 0.0483 0.9149 0.9247 0.9198 93 0.9181 0.9458 0.9318 166 0.9720 0.9789 0.9754 142 0.9363 0.9526 0.9444 0.9885
0.0504 14.0 1344 0.0444 0.9247 0.9247 0.9247 93 0.9405 0.9518 0.9461 166 0.9650 0.9718 0.9684 142 0.9455 0.9526 0.9491 0.9896
0.0494 15.0 1440 0.0427 0.9355 0.9355 0.9355 93 0.9181 0.9458 0.9318 166 0.9720 0.9789 0.9754 142 0.9410 0.9551 0.9480 0.9888
0.0445 16.0 1536 0.0492 0.8476 0.9570 0.8990 93 0.9152 0.9096 0.9124 166 0.9650 0.9718 0.9684 142 0.9153 0.9426 0.9287 0.9874
0.0457 17.0 1632 0.0445 0.8990 0.9570 0.9271 93 0.9398 0.9398 0.9398 166 0.9789 0.9789 0.9789 142 0.9435 0.9576 0.9505 0.9890
0.0433 18.0 1728 0.0500 0.8990 0.9570 0.9271 93 0.9333 0.9277 0.9305 166 0.9789 0.9789 0.9789 142 0.9409 0.9526 0.9467 0.9896
0.0395 19.0 1824 0.0490 0.9255 0.9355 0.9305 93 0.9176 0.9398 0.9286 166 0.9789 0.9789 0.9789 142 0.9409 0.9526 0.9467 0.9890
0.0375 20.0 1920 0.0494 0.9072 0.9462 0.9263 93 0.9128 0.9458 0.9290 166 0.9789 0.9789 0.9789 142 0.9343 0.9576 0.9458 0.9879
0.0394 21.0 2016 0.0503 0.9082 0.9570 0.9319 93 0.9226 0.9337 0.9281 166 0.9718 0.9718 0.9718 142 0.9363 0.9526 0.9444 0.9888
0.0364 22.0 2112 0.0498 0.8980 0.9462 0.9215 93 0.9017 0.9398 0.9204 166 0.9789 0.9789 0.9789 142 0.9274 0.9551 0.9410 0.9879
0.035 23.0 2208 0.0497 0.8878 0.9355 0.9110 93 0.9222 0.9277 0.9249 166 0.9718 0.9718 0.9718 142 0.9312 0.9451 0.9381 0.9877
0.0355 24.0 2304 0.0449 0.9263 0.9462 0.9362 93 0.9070 0.9398 0.9231 166 0.9858 0.9789 0.9823 142 0.9387 0.9551 0.9468 0.9879
0.0316 25.0 2400 0.0469 0.8725 0.9570 0.9128 93 0.9198 0.8976 0.9085 166 0.9718 0.9718 0.9718 142 0.9261 0.9377 0.9318 0.9879
0.0309 26.0 2496 0.0457 0.9355 0.9355 0.9355 93 0.9337 0.9337 0.9337 166 0.9650 0.9718 0.9684 142 0.9453 0.9476 0.9465 0.9896
0.0293 27.0 2592 0.0474 0.8889 0.9462 0.9167 93 0.9176 0.9398 0.9286 166 0.9720 0.9789 0.9754 142 0.9296 0.9551 0.9422 0.9877
0.0299 28.0 2688 0.0501 0.9278 0.9677 0.9474 93 0.9017 0.9398 0.9204 166 0.9718 0.9718 0.9718 142 0.9320 0.9576 0.9446 0.9882
0.0264 29.0 2784 0.0472 0.8812 0.9570 0.9175 93 0.9226 0.9337 0.9281 166 0.9789 0.9789 0.9789 142 0.9319 0.9551 0.9433 0.9882
0.0257 30.0 2880 0.0502 0.9158 0.9355 0.9255 93 0.8927 0.9518 0.9213 166 0.9789 0.9789 0.9789 142 0.9275 0.9576 0.9423 0.9877
0.0262 31.0 2976 0.0505 0.88 0.9462 0.9119 93 0.9162 0.9217 0.9189 166 0.9720 0.9789 0.9754 142 0.9268 0.9476 0.9371 0.9874
0.0263 32.0 3072 0.0512 0.9072 0.9462 0.9263 93 0.9379 0.9096 0.9235 166 0.9789 0.9789 0.9789 142 0.945 0.9426 0.9438 0.9885
0.0246 33.0 3168 0.0493 0.9263 0.9462 0.9362 93 0.9070 0.9398 0.9231 166 0.9787 0.9718 0.9753 142 0.9363 0.9526 0.9444 0.9893
0.0242 34.0 3264 0.0489 0.89 0.9570 0.9223 93 0.9107 0.9217 0.9162 166 0.9858 0.9789 0.9823 142 0.9315 0.9501 0.9407 0.9877
0.0217 35.0 3360 0.0516 0.9255 0.9355 0.9305 93 0.8908 0.9337 0.9118 166 0.9648 0.9648 0.9648 142 0.9244 0.9451 0.9346 0.9871
0.0221 36.0 3456 0.0565 0.89 0.9570 0.9223 93 0.9325 0.9157 0.9240 166 0.9718 0.9718 0.9718 142 0.9358 0.9451 0.9404 0.9877
0.0219 37.0 3552 0.0485 0.9082 0.9570 0.9319 93 0.9157 0.9157 0.9157 166 0.9787 0.9718 0.9753 142 0.9358 0.9451 0.9404 0.9893
0.0217 38.0 3648 0.0529 0.9184 0.9677 0.9424 93 0.9212 0.9157 0.9184 166 0.9716 0.9648 0.9682 142 0.9381 0.9451 0.9416 0.9882
0.0236 39.0 3744 0.0504 0.9255 0.9355 0.9305 93 0.9 0.9217 0.9107 166 0.9716 0.9648 0.9682 142 0.9309 0.9401 0.9355 0.9874
0.0205 40.0 3840 0.0534 0.8788 0.9355 0.9062 93 0.9053 0.9217 0.9134 166 0.9580 0.9648 0.9614 142 0.9173 0.9401 0.9286 0.9877
0.0193 41.0 3936 0.0645 0.8713 0.9462 0.9072 93 0.8772 0.9036 0.8902 166 0.9857 0.9718 0.9787 142 0.9126 0.9377 0.9250 0.9860
0.0198 42.0 4032 0.0561 0.89 0.9570 0.9223 93 0.8922 0.8976 0.8949 166 0.9858 0.9789 0.9823 142 0.9240 0.9401 0.9320 0.9860
0.018 43.0 4128 0.0540 0.9175 0.9570 0.9368 93 0.9152 0.9096 0.9124 166 0.9787 0.9718 0.9753 142 0.9380 0.9426 0.9403 0.9882
0.0188 44.0 4224 0.0571 0.8878 0.9355 0.9110 93 0.8976 0.8976 0.8976 166 0.9789 0.9789 0.9789 142 0.9236 0.9352 0.9294 0.9868
0.0177 45.0 4320 0.0529 0.9072 0.9462 0.9263 93 0.9085 0.8976 0.9030 166 0.9789 0.9789 0.9789 142 0.9330 0.9377 0.9353 0.9863
0.0178 46.0 4416 0.0539 0.8585 0.9785 0.9146 93 0.9062 0.8735 0.8896 166 0.9858 0.9789 0.9823 142 0.9214 0.9352 0.9282 0.9874
0.0168 47.0 4512 0.0545 0.9082 0.9570 0.9319 93 0.9059 0.9277 0.9167 166 0.9858 0.9789 0.9823 142 0.9340 0.9526 0.9432 0.9874
0.016 48.0 4608 0.0537 0.8990 0.9570 0.9271 93 0.9096 0.9096 0.9096 166 0.9858 0.9789 0.9823 142 0.9335 0.9451 0.9393 0.9885
0.0167 49.0 4704 0.0507 0.9082 0.9570 0.9319 93 0.9277 0.9277 0.9277 166 0.9858 0.9789 0.9823 142 0.9432 0.9526 0.9479 0.9893
0.0157 50.0 4800 0.0586 0.8878 0.9355 0.9110 93 0.8947 0.9217 0.9080 166 0.9858 0.9789 0.9823 142 0.9244 0.9451 0.9346 0.9868
0.015 51.0 4896 0.0524 0.8788 0.9355 0.9062 93 0.9162 0.9217 0.9189 166 0.9718 0.9718 0.9718 142 0.9265 0.9426 0.9345 0.9874
0.0155 52.0 4992 0.0629 0.8627 0.9462 0.9026 93 0.8988 0.9096 0.9042 166 0.9857 0.9718 0.9787 142 0.9195 0.9401 0.9297 0.9863
0.0156 53.0 5088 0.0601 0.87 0.9355 0.9016 93 0.9217 0.9217 0.9217 166 0.9787 0.9718 0.9753 142 0.9287 0.9426 0.9356 0.9868
0.0141 54.0 5184 0.0707 0.8641 0.9570 0.9082 93 0.8902 0.9277 0.9086 166 0.9857 0.9718 0.9787 142 0.9159 0.9501 0.9327 0.9849
0.0133 55.0 5280 0.0633 0.89 0.9570 0.9223 93 0.9259 0.9036 0.9146 166 0.9787 0.9718 0.9753 142 0.9355 0.9401 0.9378 0.9868
0.013 56.0 5376 0.0592 0.8980 0.9462 0.9215 93 0.9217 0.9217 0.9217 166 0.9716 0.9648 0.9682 142 0.9333 0.9426 0.9380 0.9879
0.0143 57.0 5472 0.0567 0.8725 0.9570 0.9128 93 0.9321 0.9096 0.9207 166 0.9789 0.9789 0.9789 142 0.9335 0.9451 0.9393 0.9888
0.0135 58.0 5568 0.0632 0.8812 0.9570 0.9175 93 0.9325 0.9157 0.9240 166 0.9858 0.9789 0.9823 142 0.9383 0.9476 0.9429 0.9877
0.0111 59.0 5664 0.0628 0.9072 0.9462 0.9263 93 0.9317 0.9036 0.9174 166 0.9716 0.9648 0.9682 142 0.9398 0.9352 0.9375 0.9871
0.0108 60.0 5760 0.0597 0.8969 0.9355 0.9158 93 0.9102 0.9157 0.9129 166 0.9716 0.9648 0.9682 142 0.9284 0.9377 0.9330 0.9868
0.0144 61.0 5856 0.0644 0.8889 0.9462 0.9167 93 0.9371 0.8976 0.9169 166 0.9716 0.9648 0.9682 142 0.9373 0.9327 0.935 0.9860
0.0124 62.0 5952 0.0588 0.8878 0.9355 0.9110 93 0.9059 0.9277 0.9167 166 0.9857 0.9718 0.9787 142 0.9289 0.9451 0.9370 0.9871
0.0117 63.0 6048 0.0627 0.8980 0.9462 0.9215 93 0.9226 0.9337 0.9281 166 0.9857 0.9718 0.9787 142 0.9384 0.9501 0.9442 0.9879
0.0101 64.0 6144 0.0624 0.9072 0.9462 0.9263 93 0.9222 0.9277 0.9249 166 0.9857 0.9718 0.9787 142 0.9406 0.9476 0.9441 0.9871
0.0117 65.0 6240 0.0626 0.8889 0.9462 0.9167 93 0.9048 0.9157 0.9102 166 0.9857 0.9718 0.9787 142 0.9287 0.9426 0.9356 0.9871
0.0108 66.0 6336 0.0590 0.8866 0.9247 0.9053 93 0.9264 0.9096 0.9179 166 0.9648 0.9648 0.9648 142 0.9303 0.9327 0.9315 0.9877
0.0104 67.0 6432 0.0625 0.8990 0.9570 0.9271 93 0.9107 0.9217 0.9162 166 0.9857 0.9718 0.9787 142 0.9337 0.9476 0.9406 0.9877
0.0119 68.0 6528 0.0647 0.9082 0.9570 0.9319 93 0.9059 0.9277 0.9167 166 0.9857 0.9718 0.9787 142 0.9338 0.9501 0.9419 0.9874
0.011 69.0 6624 0.0559 0.8980 0.9462 0.9215 93 0.9157 0.9157 0.9157 166 0.9787 0.9718 0.9753 142 0.9333 0.9426 0.9380 0.9888
0.0096 70.0 6720 0.0579 0.8990 0.9570 0.9271 93 0.9102 0.9157 0.9129 166 0.9857 0.9718 0.9787 142 0.9335 0.9451 0.9393 0.9882
0.0103 71.0 6816 0.0602 0.8980 0.9462 0.9215 93 0.8941 0.9157 0.9048 166 0.9857 0.9718 0.9787 142 0.9265 0.9426 0.9345 0.9877
0.0091 72.0 6912 0.0653 0.8990 0.9570 0.9271 93 0.9268 0.9157 0.9212 166 0.9787 0.9718 0.9753 142 0.9381 0.9451 0.9416 0.9877
0.0093 73.0 7008 0.0698 0.8889 0.9462 0.9167 93 0.9096 0.9096 0.9096 166 0.9716 0.9648 0.9682 142 0.9261 0.9377 0.9318 0.9866
0.0091 74.0 7104 0.0661 0.9072 0.9462 0.9263 93 0.8994 0.9157 0.9075 166 0.9787 0.9718 0.9753 142 0.9287 0.9426 0.9356 0.9871
0.0093 75.0 7200 0.0727 0.8788 0.9355 0.9062 93 0.8982 0.9036 0.9009 166 0.9716 0.9648 0.9682 142 0.9189 0.9327 0.9257 0.9855
0.0087 76.0 7296 0.0697 0.89 0.9570 0.9223 93 0.9102 0.9157 0.9129 166 0.9787 0.9718 0.9753 142 0.9289 0.9451 0.9370 0.9868
0.0104 77.0 7392 0.0677 0.88 0.9462 0.9119 93 0.9268 0.9157 0.9212 166 0.9857 0.9718 0.9787 142 0.9356 0.9426 0.9391 0.9879
0.0089 78.0 7488 0.0681 0.8878 0.9355 0.9110 93 0.9042 0.9096 0.9069 166 0.9716 0.9648 0.9682 142 0.9236 0.9352 0.9294 0.9863
0.0092 79.0 7584 0.0689 0.8889 0.9462 0.9167 93 0.8935 0.9096 0.9015 166 0.9857 0.9718 0.9787 142 0.9240 0.9401 0.9320 0.9863
0.0069 80.0 7680 0.0715 0.8889 0.9462 0.9167 93 0.9048 0.9157 0.9102 166 0.9857 0.9718 0.9787 142 0.9287 0.9426 0.9356 0.9871
0.0087 81.0 7776 0.0691 0.8889 0.9462 0.9167 93 0.8994 0.9157 0.9075 166 0.9857 0.9718 0.9787 142 0.9265 0.9426 0.9345 0.9871
0.0081 82.0 7872 0.0735 0.8889 0.9462 0.9167 93 0.9042 0.9096 0.9069 166 0.9857 0.9718 0.9787 142 0.9286 0.9401 0.9343 0.9860
0.0099 83.0 7968 0.0631 0.8980 0.9462 0.9215 93 0.9162 0.9217 0.9189 166 0.9787 0.9718 0.9753 142 0.9335 0.9451 0.9393 0.9882
0.0085 84.0 8064 0.0659 0.8889 0.9462 0.9167 93 0.9212 0.9157 0.9184 166 0.9857 0.9718 0.9787 142 0.9356 0.9426 0.9391 0.9877
0.0092 85.0 8160 0.0663 0.89 0.9570 0.9223 93 0.9217 0.9217 0.9217 166 0.9787 0.9718 0.9753 142 0.9337 0.9476 0.9406 0.9874
0.0076 86.0 8256 0.0662 0.8889 0.9462 0.9167 93 0.9042 0.9096 0.9069 166 0.9857 0.9718 0.9787 142 0.9286 0.9401 0.9343 0.9874
0.0073 87.0 8352 0.0686 0.9 0.9677 0.9326 93 0.9212 0.9157 0.9184 166 0.9857 0.9718 0.9787 142 0.9383 0.9476 0.9429 0.9874
0.0068 88.0 8448 0.0689 0.8889 0.9462 0.9167 93 0.9152 0.9096 0.9124 166 0.9857 0.9718 0.9787 142 0.9332 0.9401 0.9366 0.9871
0.0076 89.0 8544 0.0683 0.8889 0.9462 0.9167 93 0.9157 0.9157 0.9157 166 0.9718 0.9718 0.9718 142 0.9287 0.9426 0.9356 0.9868
0.0071 90.0 8640 0.0642 0.8990 0.9570 0.9271 93 0.9162 0.9217 0.9189 166 0.9857 0.9718 0.9787 142 0.9360 0.9476 0.9418 0.9882
0.0069 91.0 8736 0.0702 0.8889 0.9462 0.9167 93 0.9048 0.9157 0.9102 166 0.9857 0.9718 0.9787 142 0.9287 0.9426 0.9356 0.9871
0.0073 92.0 8832 0.0672 0.8889 0.9462 0.9167 93 0.9157 0.9157 0.9157 166 0.9787 0.9718 0.9753 142 0.9310 0.9426 0.9368 0.9879
0.007 93.0 8928 0.0653 0.8889 0.9462 0.9167 93 0.9157 0.9157 0.9157 166 0.9787 0.9718 0.9753 142 0.9310 0.9426 0.9368 0.9879
0.007 94.0 9024 0.0677 0.8889 0.9462 0.9167 93 0.9157 0.9157 0.9157 166 0.9858 0.9789 0.9823 142 0.9335 0.9451 0.9393 0.9874
0.0077 95.0 9120 0.0693 0.8889 0.9462 0.9167 93 0.9102 0.9157 0.9129 166 0.9718 0.9718 0.9718 142 0.9265 0.9426 0.9345 0.9868
0.0071 96.0 9216 0.0704 0.8889 0.9462 0.9167 93 0.9048 0.9157 0.9102 166 0.9718 0.9718 0.9718 142 0.9242 0.9426 0.9333 0.9871
0.007 97.0 9312 0.0693 0.8889 0.9462 0.9167 93 0.9102 0.9157 0.9129 166 0.9787 0.9718 0.9753 142 0.9287 0.9426 0.9356 0.9877
0.0062 98.0 9408 0.0693 0.8889 0.9462 0.9167 93 0.9157 0.9157 0.9157 166 0.9787 0.9718 0.9753 142 0.9310 0.9426 0.9368 0.9874
0.0062 99.0 9504 0.0690 0.8889 0.9462 0.9167 93 0.9157 0.9157 0.9157 166 0.9787 0.9718 0.9753 142 0.9310 0.9426 0.9368 0.9874
0.0068 100.0 9600 0.0689 0.8889 0.9462 0.9167 93 0.9157 0.9157 0.9157 166 0.9787 0.9718 0.9753 142 0.9310 0.9426 0.9368 0.9877

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2

Model tree for apwic/nerui-pt-pl10-2

Fine-tuned from indolem/indobert-base-uncased