
nerui-pt-pl30-0

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0581
  • Location Precision: 0.8713
  • Location Recall: 0.9362
  • Location F1: 0.9026
  • Location Number: 94
  • Organization Precision: 0.9217
  • Organization Recall: 0.9162
  • Organization F1: 0.9189
  • Organization Number: 167
  • Person Precision: 0.9926
  • Person Recall: 0.9854
  • Person F1: 0.9890
  • Person Number: 137
  • Overall Precision: 0.9330
  • Overall Recall: 0.9447
  • Overall F1: 0.9388
  • Overall Accuracy: 0.9878
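
Each F1 value above is the harmonic mean of the corresponding precision and recall. A quick sanity check in Python, using the numbers from the list above:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Values taken from the evaluation results above.
print(round(f1(0.8713, 0.9362), 4))  # 0.9026 (Location)
print(round(f1(0.9330, 0.9447), 4))  # 0.9388 (Overall)
```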

Model description

More information needed

Intended uses & limitations

More information needed
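
The card does not document intended uses, but the metrics above suggest a standard token-classification (NER) checkpoint with location, organization, and person labels. A minimal inference sketch under that assumption (the example sentence and usage are illustrative, not documented by the card):

```python
def build_ner(model_id: str = "apwic/nerui-pt-pl30-0"):
    """Build a token-classification pipeline for this checkpoint.

    Hypothetical usage: assumes the repo hosts a token-classification
    model whose labels cover location, organization, and person.
    """
    # Imported lazily so the sketch can be inspected without
    # pulling in transformers or downloading the checkpoint.
    from transformers import pipeline

    return pipeline(
        "token-classification",
        model=model_id,
        aggregation_strategy="simple",  # merge B-/I- word pieces into spans
    )

# entities = build_ner()("Joko Widodo meninjau proyek di Jakarta.")
```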

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
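
With 100 epochs at 96 optimizer steps per epoch (see the step column in the training log), the linear scheduler decays the learning rate from 5e-05 toward zero over 9600 steps. A small sketch of that schedule, assuming zero warmup steps (none are listed above):

```python
def linear_lr(step: int, total_steps: int = 9600, base_lr: float = 5e-05):
    """Linearly decayed learning rate, assuming no warmup phase."""
    return base_lr * (1 - step / total_steps)

print(linear_lr(0))     # 5e-05
print(linear_lr(4800))  # 2.5e-05
print(linear_lr(9600))  # 0.0
```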

Training results

Training-Loss Epoch Step Validation-Loss Location-Precision Location-Recall Location-F1 Location-Number Organization-Precision Organization-Recall Organization-F1 Organization-Number Person-Precision Person-Recall Person-F1 Person-Number Overall-Precision Overall-Recall Overall-F1 Overall-Accuracy
0.8686 1.0 96 0.4168 0.0 0.0 0.0 94 0.1788 0.1916 0.1850 167 0.2365 0.3504 0.2824 137 0.2083 0.2010 0.2046 0.8588
0.3777 2.0 192 0.2543 0.4839 0.3191 0.3846 94 0.4678 0.6527 0.5450 167 0.6416 0.8102 0.7161 137 0.5342 0.6281 0.5774 0.9215
0.2087 3.0 288 0.1150 0.7404 0.8191 0.7778 94 0.7053 0.8024 0.7507 167 0.9241 0.9781 0.9504 137 0.7859 0.8668 0.8244 0.9652
0.1366 4.0 384 0.0908 0.7456 0.9043 0.8173 94 0.7625 0.7305 0.7462 167 0.9315 0.9927 0.9611 137 0.8167 0.8618 0.8386 0.9691
0.116 5.0 480 0.0718 0.8687 0.9149 0.8912 94 0.8171 0.8024 0.8097 167 0.9507 0.9854 0.9677 137 0.8765 0.8920 0.8842 0.9760
0.0965 6.0 576 0.0544 0.8491 0.9574 0.9 94 0.8820 0.8503 0.8659 167 0.9783 0.9854 0.9818 137 0.9062 0.9221 0.9141 0.9809
0.0853 7.0 672 0.0533 0.7982 0.9681 0.875 94 0.8650 0.8443 0.8545 167 0.9926 0.9781 0.9853 137 0.8883 0.9196 0.9037 0.9809
0.0797 8.0 768 0.0470 0.8585 0.9681 0.91 94 0.8683 0.8683 0.8683 167 0.9853 0.9781 0.9817 137 0.9046 0.9296 0.9170 0.9840
0.0702 9.0 864 0.0443 0.8103 1.0 0.8952 94 0.9091 0.8383 0.8723 167 0.9926 0.9854 0.9890 137 0.9089 0.9271 0.9179 0.9859
0.07 10.0 960 0.0377 0.9583 0.9787 0.9684 94 0.8778 0.9461 0.9107 167 0.9854 0.9854 0.9854 137 0.9322 0.9673 0.9494 0.9884
0.063 11.0 1056 0.0344 0.9020 0.9787 0.9388 94 0.9202 0.8982 0.9091 167 0.9854 0.9854 0.9854 137 0.9378 0.9472 0.9425 0.9898
0.0565 12.0 1152 0.0435 0.8835 0.9681 0.9239 94 0.8935 0.9042 0.8988 167 0.9852 0.9708 0.9779 137 0.9214 0.9422 0.9317 0.9859
0.056 13.0 1248 0.0390 0.8846 0.9787 0.9293 94 0.9299 0.8743 0.9012 167 0.9781 0.9781 0.9781 137 0.9347 0.9347 0.9347 0.9876
0.0516 14.0 1344 0.0421 0.8774 0.9894 0.93 94 0.9187 0.8802 0.8991 167 0.9708 0.9708 0.9708 137 0.9256 0.9372 0.9313 0.9865
0.0485 15.0 1440 0.0335 0.9278 0.9574 0.9424 94 0.9080 0.9461 0.9267 167 0.9710 0.9781 0.9745 137 0.9340 0.9598 0.9467 0.9892
0.046 16.0 1536 0.0352 0.8952 1.0 0.9447 94 0.9551 0.8922 0.9226 167 0.9853 0.9781 0.9817 137 0.9496 0.9472 0.9484 0.9895
0.0448 17.0 1632 0.0377 0.8932 0.9787 0.9340 94 0.9245 0.8802 0.9018 167 0.9853 0.9781 0.9817 137 0.9372 0.9372 0.9372 0.9873
0.0409 18.0 1728 0.0506 0.8198 0.9681 0.8878 94 0.9586 0.8323 0.8910 167 0.9781 0.9781 0.9781 137 0.9262 0.9146 0.9204 0.9834
0.0376 19.0 1824 0.0369 0.8911 0.9574 0.9231 94 0.9620 0.9102 0.9354 167 0.9926 0.9854 0.9890 137 0.9544 0.9472 0.9508 0.9892
0.0382 20.0 1920 0.0413 0.8426 0.9681 0.9010 94 0.9494 0.8982 0.9231 167 0.9854 0.9854 0.9854 137 0.9330 0.9447 0.9388 0.9876
0.0364 21.0 2016 0.0353 0.9010 0.9681 0.9333 94 0.9387 0.9162 0.9273 167 0.9926 0.9854 0.9890 137 0.9475 0.9523 0.9499 0.9895
0.036 22.0 2112 0.0393 0.8932 0.9787 0.9340 94 0.9273 0.9162 0.9217 167 0.9781 0.9781 0.9781 137 0.9358 0.9523 0.9440 0.9884
0.0356 23.0 2208 0.0451 0.875 0.9681 0.9192 94 0.9608 0.8802 0.9188 167 0.9926 0.9854 0.9890 137 0.9491 0.9372 0.9431 0.9870
0.0322 24.0 2304 0.0394 0.8911 0.9574 0.9231 94 0.9441 0.9102 0.9268 167 0.9926 0.9854 0.9890 137 0.9472 0.9472 0.9472 0.9878
0.0342 25.0 2400 0.0412 0.9 0.9574 0.9278 94 0.9064 0.9281 0.9172 167 0.9853 0.9781 0.9817 137 0.9312 0.9523 0.9416 0.9867
0.0309 26.0 2496 0.0488 0.8835 0.9681 0.9239 94 0.9152 0.9042 0.9096 167 0.9926 0.9781 0.9853 137 0.9330 0.9447 0.9388 0.9856
0.0282 27.0 2592 0.0389 0.91 0.9681 0.9381 94 0.9277 0.9222 0.9249 167 0.9926 0.9854 0.9890 137 0.9453 0.9548 0.95 0.9892
0.0267 28.0 2688 0.0472 0.8426 0.9681 0.9010 94 0.9255 0.8922 0.9085 167 0.9926 0.9854 0.9890 137 0.9259 0.9422 0.9340 0.9873
0.0298 29.0 2784 0.0428 0.8585 0.9681 0.91 94 0.9367 0.8862 0.9108 167 0.9926 0.9854 0.9890 137 0.935 0.9397 0.9373 0.9876
0.0262 30.0 2880 0.0428 0.8835 0.9681 0.9239 94 0.9394 0.9281 0.9337 167 0.9926 0.9854 0.9890 137 0.9431 0.9573 0.9501 0.9892
0.0273 31.0 2976 0.0426 0.8932 0.9787 0.9340 94 0.9321 0.9042 0.9179 167 0.9926 0.9854 0.9890 137 0.9426 0.9497 0.9462 0.9892
0.0253 32.0 3072 0.0451 0.8738 0.9574 0.9137 94 0.9085 0.8922 0.9003 167 0.9926 0.9854 0.9890 137 0.9280 0.9397 0.9338 0.9876
0.0265 33.0 3168 0.0524 0.8558 0.9468 0.8990 94 0.8963 0.8802 0.8882 167 0.9853 0.9781 0.9817 137 0.9158 0.9296 0.9227 0.9854
0.0236 34.0 3264 0.0478 0.8713 0.9362 0.9026 94 0.925 0.8862 0.9052 167 0.9926 0.9854 0.9890 137 0.9345 0.9322 0.9333 0.9870
0.0207 35.0 3360 0.0490 0.9091 0.9574 0.9326 94 0.9387 0.9162 0.9273 167 0.9926 0.9854 0.9890 137 0.9497 0.9497 0.9497 0.9887
0.0208 36.0 3456 0.0438 0.8824 0.9574 0.9184 94 0.9325 0.9102 0.9212 167 0.9853 0.9781 0.9817 137 0.9377 0.9447 0.9412 0.9881
0.0217 37.0 3552 0.0479 0.8667 0.9681 0.9146 94 0.9157 0.9102 0.9129 167 0.9926 0.9854 0.9890 137 0.9287 0.9497 0.9391 0.9876
0.0205 38.0 3648 0.0443 0.9 0.9574 0.9278 94 0.9341 0.9341 0.9341 167 0.9926 0.9854 0.9890 137 0.9454 0.9573 0.9513 0.9892
0.02 39.0 3744 0.0443 0.875 0.9681 0.9192 94 0.9317 0.8982 0.9146 167 0.9926 0.9854 0.9890 137 0.9377 0.9447 0.9412 0.9870
0.0196 40.0 3840 0.0395 0.91 0.9681 0.9381 94 0.9329 0.9162 0.9245 167 0.9926 0.9854 0.9890 137 0.9475 0.9523 0.9499 0.9901
0.0193 41.0 3936 0.0481 0.8835 0.9681 0.9239 94 0.9325 0.9102 0.9212 167 0.9926 0.9854 0.9890 137 0.9403 0.9497 0.9450 0.9881
0.0192 42.0 4032 0.0455 0.9010 0.9681 0.9333 94 0.9394 0.9281 0.9337 167 0.9926 0.9854 0.9890 137 0.9478 0.9573 0.9525 0.9895
0.018 43.0 4128 0.0534 0.8738 0.9574 0.9137 94 0.9437 0.9042 0.9235 167 1.0 0.9854 0.9926 137 0.9447 0.9447 0.9447 0.9867
0.0177 44.0 4224 0.0550 0.8585 0.9681 0.91 94 0.9437 0.9042 0.9235 167 0.9926 0.9854 0.9890 137 0.9378 0.9472 0.9425 0.9867
0.0169 45.0 4320 0.0428 0.8812 0.9468 0.9128 94 0.9264 0.9042 0.9152 167 0.9926 0.9854 0.9890 137 0.9375 0.9422 0.9398 0.9895
0.0171 46.0 4416 0.0421 0.8812 0.9468 0.9128 94 0.9212 0.9102 0.9157 167 0.9854 0.9854 0.9854 137 0.9330 0.9447 0.9388 0.9890
0.0165 47.0 4512 0.0455 0.8738 0.9574 0.9137 94 0.9152 0.9042 0.9096 167 0.9926 0.9854 0.9890 137 0.9307 0.9447 0.9377 0.9887
0.0149 48.0 4608 0.0550 0.8544 0.9362 0.8934 94 0.8976 0.8922 0.8949 167 0.9853 0.9781 0.9817 137 0.9160 0.9322 0.9240 0.9859
0.0164 49.0 4704 0.0566 0.8571 0.9574 0.9045 94 0.9264 0.9042 0.9152 167 0.9926 0.9854 0.9890 137 0.9307 0.9447 0.9377 0.9870
0.015 50.0 4800 0.0552 0.8614 0.9255 0.8923 94 0.9202 0.8982 0.9091 167 0.9853 0.9781 0.9817 137 0.9275 0.9322 0.9298 0.9865
0.0152 51.0 4896 0.0557 0.8544 0.9362 0.8934 94 0.9198 0.8922 0.9058 167 0.9853 0.9781 0.9817 137 0.9252 0.9322 0.9287 0.9870
0.0155 52.0 4992 0.0474 0.87 0.9255 0.8969 94 0.9333 0.9222 0.9277 167 0.9926 0.9781 0.9853 137 0.9375 0.9422 0.9398 0.9890
0.0155 53.0 5088 0.0503 0.8713 0.9362 0.9026 94 0.9441 0.9102 0.9268 167 0.9926 0.9854 0.9890 137 0.9422 0.9422 0.9422 0.9878
0.013 54.0 5184 0.0518 0.8713 0.9362 0.9026 94 0.9273 0.9162 0.9217 167 0.9926 0.9854 0.9890 137 0.9353 0.9447 0.94 0.9878
0.0145 55.0 5280 0.0493 0.8911 0.9574 0.9231 94 0.9383 0.9102 0.9240 167 0.9926 0.9854 0.9890 137 0.9449 0.9472 0.9460 0.9887
0.013 56.0 5376 0.0440 0.8812 0.9468 0.9128 94 0.9202 0.8982 0.9091 167 0.9779 0.9708 0.9744 137 0.93 0.9347 0.9323 0.9887
0.0126 57.0 5472 0.0475 0.8725 0.9468 0.9082 94 0.9212 0.9102 0.9157 167 0.9853 0.9781 0.9817 137 0.9305 0.9422 0.9363 0.9876
0.0142 58.0 5568 0.0420 0.88 0.9362 0.9072 94 0.9264 0.9042 0.9152 167 0.9926 0.9854 0.9890 137 0.9373 0.9397 0.9385 0.9887
0.0113 59.0 5664 0.0637 0.8788 0.9255 0.9016 94 0.9048 0.9102 0.9075 167 0.9853 0.9781 0.9817 137 0.9256 0.9372 0.9313 0.9865
0.0121 60.0 5760 0.0527 0.8641 0.9468 0.9036 94 0.9157 0.9102 0.9129 167 0.9926 0.9854 0.9890 137 0.9284 0.9447 0.9365 0.9870
0.0115 61.0 5856 0.0488 0.8776 0.9149 0.8958 94 0.9286 0.9341 0.9313 167 0.9926 0.9854 0.9890 137 0.9378 0.9472 0.9425 0.9881
0.0112 62.0 5952 0.0552 0.8571 0.9574 0.9045 94 0.9329 0.9162 0.9245 167 0.9926 0.9854 0.9890 137 0.9333 0.9497 0.9415 0.9881
0.0102 63.0 6048 0.0470 0.8878 0.9255 0.9062 94 0.9059 0.9222 0.9139 167 0.9926 0.9854 0.9890 137 0.9307 0.9447 0.9377 0.9881
0.0123 64.0 6144 0.0571 0.8544 0.9362 0.8934 94 0.9268 0.9102 0.9184 167 0.9926 0.9854 0.9890 137 0.9305 0.9422 0.9363 0.9876
0.0122 65.0 6240 0.0566 0.8713 0.9362 0.9026 94 0.9273 0.9162 0.9217 167 0.9926 0.9854 0.9890 137 0.9353 0.9447 0.94 0.9876
0.0104 66.0 6336 0.0602 0.8182 0.9574 0.8824 94 0.9484 0.8802 0.9130 167 0.9926 0.9854 0.9890 137 0.9277 0.9347 0.9312 0.9865
0.0107 67.0 6432 0.0545 0.8738 0.9574 0.9137 94 0.9212 0.9102 0.9157 167 0.9926 0.9854 0.9890 137 0.9332 0.9472 0.9401 0.9876
0.0103 68.0 6528 0.0484 0.8911 0.9574 0.9231 94 0.9162 0.9162 0.9162 167 0.9926 0.9854 0.9890 137 0.9356 0.9497 0.9426 0.9884
0.0113 69.0 6624 0.0464 0.8627 0.9362 0.8980 94 0.9207 0.9042 0.9124 167 0.9926 0.9854 0.9890 137 0.9303 0.9397 0.9350 0.9873
0.0101 70.0 6720 0.0502 0.8866 0.9149 0.9005 94 0.9112 0.9222 0.9167 167 0.9926 0.9854 0.9890 137 0.9328 0.9422 0.9375 0.9881
0.0095 71.0 6816 0.0554 0.8641 0.9468 0.9036 94 0.9152 0.9042 0.9096 167 0.9926 0.9854 0.9890 137 0.9282 0.9422 0.9352 0.9876
0.0098 72.0 6912 0.0588 0.8641 0.9468 0.9036 94 0.9333 0.9222 0.9277 167 0.9926 0.9854 0.9890 137 0.9356 0.9497 0.9426 0.9870
0.0095 73.0 7008 0.0593 0.8824 0.9574 0.9184 94 0.9333 0.9222 0.9277 167 0.9926 0.9854 0.9890 137 0.9404 0.9523 0.9463 0.9878
0.0093 74.0 7104 0.0541 0.8544 0.9362 0.8934 94 0.9390 0.9222 0.9305 167 0.9926 0.9854 0.9890 137 0.9355 0.9472 0.9413 0.9881
0.0083 75.0 7200 0.0563 0.8544 0.9362 0.8934 94 0.9333 0.9222 0.9277 167 0.9926 0.9854 0.9890 137 0.9332 0.9472 0.9401 0.9881
0.0093 76.0 7296 0.0562 0.8462 0.9362 0.8889 94 0.9207 0.9042 0.9124 167 0.9926 0.9854 0.9890 137 0.9257 0.9397 0.9327 0.9873
0.0094 77.0 7392 0.0549 0.8571 0.9574 0.9045 94 0.95 0.9102 0.9297 167 0.9926 0.9854 0.9890 137 0.9401 0.9472 0.9437 0.9881
0.0082 78.0 7488 0.0549 0.87 0.9255 0.8969 94 0.9226 0.9281 0.9254 167 0.9926 0.9854 0.9890 137 0.9332 0.9472 0.9401 0.9878
0.0084 79.0 7584 0.0584 0.8627 0.9362 0.8980 94 0.9321 0.9042 0.9179 167 0.9853 0.9781 0.9817 137 0.9325 0.9372 0.9348 0.9878
0.0087 80.0 7680 0.0553 0.8544 0.9362 0.8934 94 0.9506 0.9222 0.9362 167 0.9926 0.9854 0.9890 137 0.9401 0.9472 0.9437 0.9890
0.0084 81.0 7776 0.0600 0.8571 0.9574 0.9045 94 0.9157 0.9102 0.9129 167 0.9853 0.9781 0.9817 137 0.9238 0.9447 0.9342 0.9870
0.0075 82.0 7872 0.0540 0.8788 0.9255 0.9016 94 0.9162 0.9162 0.9162 167 0.9854 0.9854 0.9854 137 0.9305 0.9422 0.9363 0.9878
0.008 83.0 7968 0.0577 0.8641 0.9468 0.9036 94 0.9268 0.9102 0.9184 167 0.9854 0.9854 0.9854 137 0.9307 0.9447 0.9377 0.9878
0.0077 84.0 8064 0.0613 0.8641 0.9468 0.9036 94 0.9222 0.9222 0.9222 167 0.9926 0.9854 0.9890 137 0.9310 0.9497 0.9403 0.9878
0.0061 85.0 8160 0.0596 0.8529 0.9255 0.8878 94 0.9268 0.9102 0.9184 167 0.9853 0.9781 0.9817 137 0.9279 0.9372 0.9325 0.9873
0.0069 86.0 8256 0.0556 0.8476 0.9468 0.8945 94 0.9273 0.9162 0.9217 167 0.9853 0.9781 0.9817 137 0.9261 0.9447 0.9353 0.9878
0.0071 87.0 8352 0.0579 0.8969 0.9255 0.9110 94 0.9181 0.9401 0.9290 167 0.9853 0.9781 0.9817 137 0.9356 0.9497 0.9426 0.9881
0.008 88.0 8448 0.0609 0.8411 0.9574 0.8955 94 0.9259 0.8982 0.9119 167 0.9854 0.9854 0.9854 137 0.9236 0.9422 0.9328 0.9873
0.0072 89.0 8544 0.0579 0.8713 0.9362 0.9026 94 0.9217 0.9162 0.9189 167 0.9853 0.9781 0.9817 137 0.9305 0.9422 0.9363 0.9881
0.0071 90.0 8640 0.0551 0.8738 0.9574 0.9137 94 0.9390 0.9222 0.9305 167 0.9926 0.9854 0.9890 137 0.9404 0.9523 0.9463 0.9892
0.0082 91.0 8736 0.0543 0.8738 0.9574 0.9137 94 0.9333 0.9222 0.9277 167 0.9926 0.9854 0.9890 137 0.9381 0.9523 0.9451 0.9890
0.0076 92.0 8832 0.0563 0.8738 0.9574 0.9137 94 0.9277 0.9222 0.9249 167 0.9926 0.9854 0.9890 137 0.9358 0.9523 0.9440 0.9887
0.0067 93.0 8928 0.0573 0.8812 0.9468 0.9128 94 0.9281 0.9281 0.9281 167 0.9926 0.9854 0.9890 137 0.9381 0.9523 0.9451 0.9884
0.0062 94.0 9024 0.0569 0.8654 0.9574 0.9091 94 0.9333 0.9222 0.9277 167 0.9926 0.9854 0.9890 137 0.9358 0.9523 0.9440 0.9887
0.0077 95.0 9120 0.0549 0.8725 0.9468 0.9082 94 0.9333 0.9222 0.9277 167 0.9926 0.9854 0.9890 137 0.9380 0.9497 0.9438 0.9890
0.0072 96.0 9216 0.0581 0.8713 0.9362 0.9026 94 0.9222 0.9222 0.9222 167 0.9926 0.9854 0.9890 137 0.9332 0.9472 0.9401 0.9878
0.0067 97.0 9312 0.0567 0.8725 0.9468 0.9082 94 0.9217 0.9162 0.9189 167 0.9854 0.9854 0.9854 137 0.9309 0.9472 0.9390 0.9881
0.0071 98.0 9408 0.0572 0.88 0.9362 0.9072 94 0.9277 0.9222 0.9249 167 0.9926 0.9854 0.9890 137 0.9378 0.9472 0.9425 0.9884
0.0063 99.0 9504 0.0579 0.8713 0.9362 0.9026 94 0.9217 0.9162 0.9189 167 0.9926 0.9854 0.9890 137 0.9330 0.9447 0.9388 0.9878
0.0073 100.0 9600 0.0581 0.8713 0.9362 0.9026 94 0.9217 0.9162 0.9189 167 0.9926 0.9854 0.9890 137 0.9330 0.9447 0.9388 0.9878

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2

Model tree for apwic/nerui-pt-pl30-0

Fine-tuned from indolem/indobert-base-uncased